Mirror of https://github.com/openclaw/openclaw.git (synced 2026-04-18 21:17:27 +00:00)
revert(docs): undo markdownlint autofix churn
@@ -131,9 +131,9 @@ launchctl bootstrap gui/$(id -u) ~/Library/LaunchAgents/com.claude-max-api.plist

 ## Links

-- **npm:** [https://www.npmjs.com/package/claude-max-api-proxy](https://www.npmjs.com/package/claude-max-api-proxy)
-- **GitHub:** [https://github.com/atalovesyou/claude-max-api-proxy](https://github.com/atalovesyou/claude-max-api-proxy)
-- **Issues:** [https://github.com/atalovesyou/claude-max-api-proxy/issues](https://github.com/atalovesyou/claude-max-api-proxy/issues)
+- **npm:** https://www.npmjs.com/package/claude-max-api-proxy
+- **GitHub:** https://github.com/atalovesyou/claude-max-api-proxy
+- **Issues:** https://github.com/atalovesyou/claude-max-api-proxy/issues

 ## Notes

@@ -25,7 +25,7 @@ For Anthropic models, use your Anthropic API key.
 openclaw onboard --auth-choice cloudflare-ai-gateway-api-key
 ```

-1. Set a default model:
+2. Set a default model:

 ```json5
 {
@@ -15,8 +15,8 @@ When enabled, OpenClaw uploads the audio file to Deepgram and injects the transc
 into the reply pipeline (`{{Transcript}}` + `[Audio]` block). This is **not streaming**;
 it uses the pre-recorded transcription endpoint.

-Website: [https://deepgram.com](https://deepgram.com)
-Docs: [https://developers.deepgram.com](https://developers.deepgram.com)
+Website: https://deepgram.com
+Docs: https://developers.deepgram.com

 ## Quick start

@@ -26,7 +26,7 @@ Docs: [https://developers.deepgram.com](https://developers.deepgram.com)
 DEEPGRAM_API_KEY=dg_...
 ```

-1. Enable the provider:
+2. Enable the provider:

 ```json5
 {
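The Deepgram quick start above comes down to making a key available before enabling the provider. A minimal sketch, assuming OpenClaw reads `DEEPGRAM_API_KEY` from the environment (the `dg_placeholder` value stands in for a real key):

```shell
# Placeholder key: substitute your real Deepgram API key (dg_...).
export DEEPGRAM_API_KEY="dg_placeholder"

# Sanity-check that the key is visible to child processes such as openclaw.
[ -n "$DEEPGRAM_API_KEY" ] && echo "DEEPGRAM_API_KEY is set"
```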
@@ -179,7 +179,7 @@ Use the interactive config wizard to set MiniMax without editing JSON:
 - Model refs are `minimax/<model>`.
 - Coding Plan usage API: `https://api.minimaxi.com/v1/api/openplatform/coding_plan/remains` (requires a coding plan key).
 - Update pricing values in `models.json` if you need exact cost tracking.
-- Referral link for MiniMax Coding Plan (10% off): [https://platform.minimax.io/subscribe/coding-plan?code=DbXJTRClnb&source=link](https://platform.minimax.io/subscribe/coding-plan?code=DbXJTRClnb&source=link)
+- Referral link for MiniMax Coding Plan (10% off): https://platform.minimax.io/subscribe/coding-plan?code=DbXJTRClnb&source=link
 - See [/concepts/model-providers](/concepts/model-providers) for provider rules.
 - Use `openclaw models list` and `openclaw models set minimax/MiniMax-M2.1` to switch.

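The usage endpoint listed in the notes can be queried with curl. A sketch under assumptions: the auth scheme (Bearer token) is not confirmed by this page, and `MINIMAX_CODING_PLAN_KEY` is a placeholder, so the command is printed rather than executed:

```shell
# Placeholder for your MiniMax coding plan key (assumption: Bearer auth).
MINIMAX_CODING_PLAN_KEY="sk-placeholder"
URL="https://api.minimaxi.com/v1/api/openplatform/coding_plan/remains"

# Print the curl invocation; drop the leading echo to actually send it.
echo curl -s -H "Authorization: Bearer $MINIMAX_CODING_PLAN_KEY" "$URL"
```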
@@ -15,14 +15,14 @@ Kimi Coding with `kimi-coding/k2p5`.

 Current Kimi K2 model IDs:

-{/*moonshot-kimi-k2-ids:start*/ && null}
+{/* moonshot-kimi-k2-ids:start */ && null}

 - `kimi-k2.5`
 - `kimi-k2-0905-preview`
 - `kimi-k2-turbo-preview`
 - `kimi-k2-thinking`
 - `kimi-k2-thinking-turbo`
-{/*moonshot-kimi-k2-ids:end*/ && null}
+{/* moonshot-kimi-k2-ids:end */ && null}

 ```bash
 openclaw onboard --auth-choice moonshot-api-key
@@ -12,7 +12,7 @@ Ollama is a local LLM runtime that makes it easy to run open-source models on yo

 ## Quick start

-1. Install Ollama: [https://ollama.ai](https://ollama.ai)
+1. Install Ollama: https://ollama.ai

 2. Pull a model:

@@ -26,7 +26,7 @@ ollama pull qwen2.5-coder:32b
 ollama pull deepseek-r1:32b
 ```

-1. Enable Ollama for OpenClaw (any value works; Ollama doesn't require a real key):
+3. Enable Ollama for OpenClaw (any value works; Ollama doesn't require a real key):

 ```bash
 # Set environment variable
@@ -36,7 +36,7 @@ export OLLAMA_API_KEY="ollama-local"
 openclaw config set models.providers.ollama.apiKey "ollama-local"
 ```

-1. Use Ollama models:
+4. Use Ollama models:

 ```json5
 {
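As the Ollama docs being reverted here note, Ollama itself never validates the key; OpenClaw just needs some non-empty value. A minimal sketch of the environment-variable route:

```shell
# Any non-empty value works; Ollama does not check it.
export OLLAMA_API_KEY="ollama-local"

# Confirm it is set before launching openclaw.
[ -n "$OLLAMA_API_KEY" ] && echo "OLLAMA_API_KEY=$OLLAMA_API_KEY"
```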
@@ -22,7 +22,7 @@ The [Vercel AI Gateway](https://vercel.com/ai-gateway) provides a unified API to
 openclaw onboard --auth-choice ai-gateway-api-key
 ```

-1. Set a default model:
+2. Set a default model:

 ```json5
 {