mirror of
https://github.com/openclaw/openclaw.git
synced 2026-04-18 22:57:28 +00:00
feat: Provider/Mistral full support for Mistral on OpenClaw 🇫🇷 (#23845)
* Onboard: add Mistral auth choice and CLI flags
* Onboard/Auth: add Mistral provider config defaults
* Auth choice: wire Mistral API-key flow
* Onboard non-interactive: support --mistral-api-key
* Media understanding: add Mistral Voxtral audio provider
* Changelog: note Mistral onboarding and media support
* Docs: add Mistral provider and onboarding/media references
* Tests: cover Mistral media registry/defaults and auth mapping
* Memory: add Mistral embeddings provider support
* Onboarding: refresh Mistral model metadata
* Docs: document Mistral embeddings and endpoints
* Memory: persist Mistral embedding client state in managers
* Memory: add regressions for mistral provider wiring
* Gateway: add live tool probe retry helper
* Gateway: cover live tool probe retry helper
* Gateway: retry malformed live tool-read probe responses
* Memory: support plain-text batch error bodies
* Tests: add Mistral Voxtral live transcription smoke
* Docs: add Mistral live audio test command
* Revert: remove Mistral live voice test and docs entry
* Onboard: re-export Mistral default model ref from models
* Changelog: credit joeVenner for Mistral work
* fix: include Mistral in auto audio key fallback
* Update CHANGELOG.md
* Update CHANGELOG.md

---------

Co-authored-by: Shakker <shakkerdroid@gmail.com>
```diff
@@ -321,13 +321,14 @@ Options:
 - `--non-interactive`
 - `--mode <local|remote>`
 - `--flow <quickstart|advanced|manual>` (manual is an alias for advanced)
-- `--auth-choice <setup-token|token|chutes|openai-codex|openai-api-key|openrouter-api-key|ai-gateway-api-key|moonshot-api-key|moonshot-api-key-cn|kimi-code-api-key|synthetic-api-key|venice-api-key|gemini-api-key|zai-api-key|apiKey|minimax-api|minimax-api-lightning|opencode-zen|custom-api-key|skip>`
+- `--auth-choice <setup-token|token|chutes|openai-codex|openai-api-key|openrouter-api-key|ai-gateway-api-key|moonshot-api-key|moonshot-api-key-cn|kimi-code-api-key|synthetic-api-key|venice-api-key|gemini-api-key|zai-api-key|mistral-api-key|apiKey|minimax-api|minimax-api-lightning|opencode-zen|custom-api-key|skip>`
 - `--token-provider <id>` (non-interactive; used with `--auth-choice token`)
 - `--token <token>` (non-interactive; used with `--auth-choice token`)
 - `--token-profile-id <id>` (non-interactive; default: `<provider>:manual`)
 - `--token-expires-in <duration>` (non-interactive; e.g. `365d`, `12h`)
 - `--anthropic-api-key <key>`
 - `--openai-api-key <key>`
+- `--mistral-api-key <key>`
 - `--openrouter-api-key <key>`
 - `--ai-gateway-api-key <key>`
 - `--moonshot-api-key <key>`
```
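The `--token-expires-in` flag in the hunk above accepts shorthand durations such as `365d` or `12h`. As a rough illustration only (this is not OpenClaw's actual parser, which may accept more units and forms), such values can be turned into milliseconds like this:

```typescript
// Hypothetical sketch of parsing `--token-expires-in` values such as
// "365d" or "12h" into milliseconds. Illustrative only.
const UNIT_MS: Record<string, number> = {
  s: 1_000,        // seconds
  m: 60_000,       // minutes
  h: 3_600_000,    // hours
  d: 86_400_000,   // days
};

function parseExpiresIn(value: string): number {
  const match = /^(\d+)([smhd])$/.exec(value.trim());
  if (!match) throw new Error(`invalid duration: ${value}`);
  return Number(match[1]) * UNIT_MS[match[2]];
}
```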
````diff
@@ -56,6 +56,14 @@ openclaw onboard --non-interactive \
 # --auth-choice zai-cn
 ```

+Non-interactive Mistral example:
+
+```bash
+openclaw onboard --non-interactive \
+  --auth-choice mistral-api-key \
+  --mistral-api-key "$MISTRAL_API_KEY"
+```
+
 Flow notes:

 - `quickstart`: minimal prompts, auto-generates a gateway token.
````
```diff
@@ -105,7 +105,8 @@ Defaults:
 2. `openai` if an OpenAI key can be resolved.
 3. `gemini` if a Gemini key can be resolved.
 4. `voyage` if a Voyage key can be resolved.
-5. Otherwise memory search stays disabled until configured.
+5. `mistral` if a Mistral key can be resolved.
+6. Otherwise memory search stays disabled until configured.
 - Local mode uses node-llama-cpp and may require `pnpm approve-builds`.
 - Uses sqlite-vec (when available) to accelerate vector search inside SQLite.
```
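The default order in the hunk above amounts to a first-match scan over providers whose key can be resolved (after the `local` model-path check that precedes it). A minimal sketch, with `resolveKey` standing in for OpenClaw's real resolution across auth profiles, `models.providers.*.apiKey`, and env vars:

```typescript
// Hypothetical sketch of the default remote embedding-provider selection
// order. `resolveKey` is a stand-in for OpenClaw's real key resolution.
type Provider = "openai" | "gemini" | "voyage" | "mistral";

function pickEmbeddingProvider(
  resolveKey: (p: Provider) => string | undefined,
): Provider | "disabled" {
  const order: Provider[] = ["openai", "gemini", "voyage", "mistral"];
  for (const p of order) {
    if (resolveKey(p)) return p;
  }
  return "disabled"; // memory search stays off until configured
}
```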
```diff
@@ -114,7 +115,9 @@ resolves keys from auth profiles, `models.providers.*.apiKey`, or environment
 variables. Codex OAuth only covers chat/completions and does **not** satisfy
 embeddings for memory search. For Gemini, use `GEMINI_API_KEY` or
 `models.providers.google.apiKey`. For Voyage, use `VOYAGE_API_KEY` or
-`models.providers.voyage.apiKey`. When using a custom OpenAI-compatible endpoint,
+`models.providers.voyage.apiKey`. For Mistral, use `MISTRAL_API_KEY` or
+`models.providers.mistral.apiKey`.
+When using a custom OpenAI-compatible endpoint,
 set `memorySearch.remote.apiKey` (and optional `memorySearch.remote.headers`).

 ### QMD backend (experimental)
```
```diff
@@ -328,7 +331,7 @@ If you don't want to set an API key, use `memorySearch.provider = "local"` or se
 Fallbacks:

-- `memorySearch.fallback` can be `openai`, `gemini`, `local`, or `none`.
+- `memorySearch.fallback` can be `openai`, `gemini`, `voyage`, `mistral`, `local`, or `none`.
 - The fallback provider is only used when the primary embedding provider fails.

 Batch indexing (OpenAI + Gemini + Voyage):
```
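Since the fallback provider is only consulted when the primary embedding provider fails, the control flow is roughly the following. This is a hedged sketch with hypothetical names, not OpenClaw's implementation; real embedding calls are async, but a synchronous signature keeps the example short:

```typescript
// Hypothetical sketch of primary/fallback embedding behavior: the fallback
// (mapped from `memorySearch.fallback`) runs only if the primary throws.
type Embed = (text: string) => number[];

function embedWithFallback(text: string, primary: Embed, fallback?: Embed): number[] {
  try {
    return primary(text);
  } catch (err) {
    if (!fallback) throw err; // fallback "none": surface the failure
    return fallback(text);    // e.g. fallback = "mistral"
  }
}
```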
```diff
@@ -131,11 +131,13 @@ OpenClaw ships with the pi‑ai catalog. These providers require **no**
 - OpenRouter: `openrouter` (`OPENROUTER_API_KEY`)
   - Example model: `openrouter/anthropic/claude-sonnet-4-5`
 - xAI: `xai` (`XAI_API_KEY`)
+- Mistral: `mistral` (`MISTRAL_API_KEY`)
+  - Example model: `mistral/mistral-large-latest`
+  - CLI: `openclaw onboard --auth-choice mistral-api-key`
 - Groq: `groq` (`GROQ_API_KEY`)
 - Cerebras: `cerebras` (`CEREBRAS_API_KEY`)
   - GLM models on Cerebras use ids `zai-glm-4.7` and `zai-glm-4.6`.
   - OpenAI-compatible base URL: `https://api.cerebras.ai/v1`.
-- Mistral: `mistral` (`MISTRAL_API_KEY`)
 - GitHub Copilot: `github-copilot` (`COPILOT_GITHUB_TOKEN` / `GH_TOKEN` / `GITHUB_TOKEN`)
 - Hugging Face Inference: `huggingface` (`HUGGINGFACE_HUB_TOKEN` or `HF_TOKEN`) — OpenAI-compatible router; example model: `huggingface/deepseek-ai/DeepSeek-R1`; CLI: `openclaw onboard --auth-choice huggingface-api-key`. See [Hugging Face (Inference)](/providers/huggingface).
```
```diff
@@ -91,6 +91,10 @@
       "source": "/moonshot",
       "destination": "/providers/moonshot"
     },
+    {
+      "source": "/mistral",
+      "destination": "/providers/mistral"
+    },
     {
       "source": "/openrouter",
       "destination": "/providers/openrouter"
```
```diff
@@ -1066,6 +1070,7 @@
       "providers/bedrock",
       "providers/vercel-ai-gateway",
       "providers/moonshot",
+      "providers/mistral",
       "providers/minimax",
       "providers/opencode",
       "providers/glm",
```
```diff
@@ -1251,14 +1251,15 @@ still need a real API key (`OPENAI_API_KEY` or `models.providers.openai.apiKey`)
 If you don't set a provider explicitly, OpenClaw auto-selects a provider when it
 can resolve an API key (auth profiles, `models.providers.*.apiKey`, or env vars).
 It prefers OpenAI if an OpenAI key resolves, otherwise Gemini if a Gemini key
-resolves. If neither key is available, memory search stays disabled until you
-configure it. If you have a local model path configured and present, OpenClaw
+resolves, then Voyage, then Mistral. If no remote key is available, memory
+search stays disabled until you configure it. If you have a local model path
+configured and present, OpenClaw
 prefers `local`.

 If you'd rather stay local, set `memorySearch.provider = "local"` (and optionally
 `memorySearch.fallback = "none"`). If you want Gemini embeddings, set
 `memorySearch.provider = "gemini"` and provide `GEMINI_API_KEY` (or
-`memorySearch.remote.apiKey`). We support **OpenAI, Gemini, or local** embedding
+`memorySearch.remote.apiKey`). We support **OpenAI, Gemini, Voyage, Mistral, or local** embedding
 models - see [Memory](/concepts/memory) for the setup details.

 ### Does memory persist forever? What are the limits?
```
````diff
@@ -94,11 +94,27 @@ Note: Binary detection is best-effort across macOS/Linux/Windows; ensure the CLI
 }
 ```

+### Provider-only (Mistral Voxtral)
+
+```json5
+{
+  tools: {
+    media: {
+      audio: {
+        enabled: true,
+        models: [{ provider: "mistral", model: "voxtral-mini-latest" }],
+      },
+    },
+  },
+}
+```
+
 ## Notes & limits

 - Provider auth follows the standard model auth order (auth profiles, env vars, `models.providers.*.apiKey`).
 - Deepgram picks up `DEEPGRAM_API_KEY` when `provider: "deepgram"` is used.
 - Deepgram setup details: [Deepgram (audio transcription)](/providers/deepgram).
+- Mistral setup details: [Mistral](/providers/mistral).
 - Audio providers can override `baseUrl`, `headers`, and `providerOptions` via `tools.media.audio`.
 - Default size cap is 20MB (`tools.media.audio.maxBytes`). Oversize audio is skipped for that model and the next entry is tried.
 - Default `maxChars` for audio is **unset** (full transcript). Set `tools.media.audio.maxChars` or per-entry `maxChars` to trim output.
````
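The oversize handling described in those notes (skip the entry whose cap is exceeded, try the next configured model) can be sketched as follows. Types and the per-entry `maxBytes` override are illustrative assumptions, not the actual OpenClaw internals:

```typescript
// Hypothetical sketch: choose the first audio model entry whose size cap
// (a per-entry maxBytes if set, else the documented 20MB default) admits
// the file; oversize entries are skipped and the next one is tried.
interface AudioEntry {
  provider: string;
  model: string;
  maxBytes?: number; // assumed per-entry override for illustration
}

const DEFAULT_MAX_BYTES = 20 * 1024 * 1024; // documented 20MB default cap

function pickAudioEntry(entries: AudioEntry[], fileBytes: number): AudioEntry | undefined {
  return entries.find((e) => fileBytes <= (e.maxBytes ?? DEFAULT_MAX_BYTES));
}
```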
```diff
@@ -175,11 +175,11 @@ If you omit `capabilities`, the entry is eligible for the list it appears in.
 ## Provider support matrix (OpenClaw integrations)

-| Capability | Provider integration                             | Notes                                             |
-| ---------- | ------------------------------------------------ | ------------------------------------------------- |
-| Image      | OpenAI / Anthropic / Google / others via `pi-ai` | Any image-capable model in the registry works.    |
-| Audio      | OpenAI, Groq, Deepgram, Google                   | Provider transcription (Whisper/Deepgram/Gemini). |
-| Video      | Google (Gemini API)                              | Provider video understanding.                     |
+| Capability | Provider integration                             | Notes                                                     |
+| ---------- | ------------------------------------------------ | --------------------------------------------------------- |
+| Image      | OpenAI / Anthropic / Google / others via `pi-ai` | Any image-capable model in the registry works.            |
+| Audio      | OpenAI, Groq, Deepgram, Google, Mistral          | Provider transcription (Whisper/Deepgram/Gemini/Voxtral). |
+| Video      | Google (Gemini API)                              | Provider video understanding.                             |

 ## Recommended providers
```
```diff
@@ -190,7 +190,7 @@ If you omit `capabilities`, the entry is eligible for the list it appears in.
 **Audio**

-- `openai/gpt-4o-mini-transcribe`, `groq/whisper-large-v3-turbo`, or `deepgram/nova-3`.
+- `openai/gpt-4o-mini-transcribe`, `groq/whisper-large-v3-turbo`, `deepgram/nova-3`, or `mistral/voxtral-mini-latest`.
 - CLI fallback: `whisper-cli` (whisper-cpp) or `whisper`.
 - Deepgram setup: [Deepgram (audio transcription)](/providers/deepgram).
```
```diff
@@ -44,6 +44,7 @@ See [Venice AI](/providers/venice).
 - [Together AI](/providers/together)
 - [Cloudflare AI Gateway](/providers/cloudflare-ai-gateway)
 - [Moonshot AI (Kimi + Kimi Coding)](/providers/moonshot)
+- [Mistral](/providers/mistral)
 - [OpenCode Zen](/providers/opencode)
 - [Amazon Bedrock](/providers/bedrock)
 - [Z.AI](/providers/zai)
```
docs/providers/mistral.md (new file, 54 lines)

````diff
@@ -0,0 +1,54 @@
+---
+summary: "Use Mistral models and Voxtral transcription with OpenClaw"
+read_when:
+  - You want to use Mistral models in OpenClaw
+  - You need Mistral API key onboarding and model refs
+title: "Mistral"
+---
+
+# Mistral
+
+OpenClaw supports Mistral for both text/image model routing (`mistral/...`) and
+audio transcription via Voxtral in media understanding.
+Mistral can also be used for memory embeddings (`memorySearch.provider = "mistral"`).
+
+## CLI setup
+
+```bash
+openclaw onboard --auth-choice mistral-api-key
+# or non-interactive
+openclaw onboard --mistral-api-key "$MISTRAL_API_KEY"
+```
+
+## Config snippet (LLM provider)
+
+```json5
+{
+  env: { MISTRAL_API_KEY: "sk-..." },
+  agents: { defaults: { model: { primary: "mistral/mistral-large-latest" } } },
+}
+```
+
+## Config snippet (audio transcription with Voxtral)
+
+```json5
+{
+  tools: {
+    media: {
+      audio: {
+        enabled: true,
+        models: [{ provider: "mistral", model: "voxtral-mini-latest" }],
+      },
+    },
+  },
+}
+```
+
+## Notes
+
+- Mistral auth uses `MISTRAL_API_KEY`.
+- Provider base URL defaults to `https://api.mistral.ai/v1`.
+- Onboarding default model is `mistral/mistral-large-latest`.
+- Media-understanding default audio model for Mistral is `voxtral-mini-latest`.
+- Media transcription path uses `/v1/audio/transcriptions`.
+- Memory embeddings path uses `/v1/embeddings` (default model: `mistral-embed`).
````
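The endpoint notes in the new doc imply a standard OpenAI-compatible embeddings request shape. As a hedged sketch (this builds the request only and is not OpenClaw's internal client; base URL, path, and default model are taken from the notes above):

```typescript
// Hypothetical sketch: assemble a Mistral embeddings request matching the
// documented base URL and `/v1/embeddings` path. No network call is made.
const MISTRAL_BASE_URL = "https://api.mistral.ai/v1";

function buildEmbeddingsRequest(apiKey: string, inputs: string[]) {
  return {
    url: `${MISTRAL_BASE_URL}/embeddings`,
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      // `mistral-embed` is the default embedding model noted above.
      body: JSON.stringify({ model: "mistral-embed", input: inputs }),
    },
  };
}
```

Usage would be along the lines of `const { url, init } = buildEmbeddingsRequest(key, ["hello"]); await fetch(url, init);`.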
```diff
@@ -39,6 +39,7 @@ See [Venice AI](/providers/venice).
 - [Vercel AI Gateway](/providers/vercel-ai-gateway)
 - [Cloudflare AI Gateway](/providers/cloudflare-ai-gateway)
 - [Moonshot AI (Kimi + Kimi Coding)](/providers/moonshot)
+- [Mistral](/providers/mistral)
 - [Synthetic](/providers/synthetic)
 - [OpenCode Zen](/providers/opencode)
 - [Z.AI](/providers/zai)
```
```diff
@@ -67,6 +67,7 @@ Semantic memory search uses **embedding APIs** when configured for remote provid
 - `memorySearch.provider = "openai"` → OpenAI embeddings
 - `memorySearch.provider = "gemini"` → Gemini embeddings
 - `memorySearch.provider = "voyage"` → Voyage embeddings
+- `memorySearch.provider = "mistral"` → Mistral embeddings
 - Optional fallback to a remote provider if local embeddings fail

 You can keep it local with `memorySearch.provider = "local"` (no API usage).
```
````diff
@@ -86,6 +86,16 @@ Add `--json` for a machine-readable summary.
   --gateway-bind loopback
 ```
 </Accordion>
+<Accordion title="Mistral example">
+```bash
+openclaw onboard --non-interactive \
+  --mode local \
+  --auth-choice mistral-api-key \
+  --mistral-api-key "$MISTRAL_API_KEY" \
+  --gateway-port 18789 \
+  --gateway-bind loopback
+```
+</Accordion>
 <Accordion title="Synthetic example">
 ```bash
 openclaw onboard --non-interactive \
````