Mirror of https://github.com/openclaw/openclaw.git (synced 2026-03-30 04:53:33 +00:00)
docs(ollama): update onboarding flow
Co-Authored-By: Jeffrey Morgan <jmorganca@gmail.com>
(cherry picked from commit e8ca2ff4e522f2d971801a537b3c4fdfecde0711)
Committed by: Peter Steinberger
Parent: 0068f55dd8
Commit: f906bf58db
@@ -37,7 +37,7 @@ Looking for chat channel docs (WhatsApp/Telegram/Discord/Slack/Mattermost (plugi
 - [Mistral](/providers/mistral)
 - [Moonshot AI (Kimi + Kimi Coding)](/providers/moonshot)
 - [NVIDIA](/providers/nvidia)
-- [Ollama (local models)](/providers/ollama)
+- [Ollama (cloud + local models)](/providers/ollama)
 - [OpenAI (API + Codex)](/providers/openai)
 - [OpenCode (Zen + Go)](/providers/opencode)
 - [OpenRouter](/providers/openrouter)
@@ -1,7 +1,7 @@
 ---
-summary: "Run OpenClaw with Ollama (local LLM runtime)"
+summary: "Run OpenClaw with Ollama (cloud and local models)"
 read_when:
-  - You want to run OpenClaw with local models via Ollama
+  - You want to run OpenClaw with cloud or local models via Ollama
   - You need Ollama setup and configuration guidance
 title: "Ollama"
 ---
@@ -16,6 +16,42 @@ Ollama is a local LLM runtime that makes it easy to run open-source models on yo

 ## Quick start

+### Onboarding wizard (recommended)
+
+The fastest way to set up Ollama is through the onboarding wizard:
+
+```bash
+openclaw onboard
+```
+
+Select **Ollama** from the provider list. The wizard will:
+
+1. Ask for the Ollama base URL where your instance can be reached (default `http://127.0.0.1:11434`).
+2. Let you choose **Cloud + Local** (cloud models and local models) or **Local** (local models only).
+3. Open a browser sign-in flow if you choose **Cloud + Local** and are not signed in to ollama.com.
+4. Discover available models and suggest defaults.
+5. Auto-pull the selected model if it is not available locally.
+
+Non-interactive mode is also supported:
+
+```bash
+openclaw onboard --non-interactive \
+  --auth-choice ollama \
+  --accept-risk
+```
+
+Optionally specify a custom base URL or model:
+
+```bash
+openclaw onboard --non-interactive \
+  --auth-choice ollama \
+  --custom-base-url "http://ollama-host:11434" \
+  --custom-model-id "qwen3.5:27b" \
+  --accept-risk
+```
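The wizard's base-URL prompt and the `--custom-base-url` flag above both expect a full URL including the scheme. A quick shell sanity check, as a sketch (`ollama-host` is a placeholder hostname, not a real default):

```shell
# Sanity-check a base URL before handing it to --custom-base-url
# ("ollama-host" is a placeholder hostname used for illustration)
base_url="http://ollama-host:11434"
case "$base_url" in
  http://*|https://*) echo "ok: $base_url" ;;
  *) echo "error: base URL needs an http(s) scheme" >&2; exit 1 ;;
esac
```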
+
+### Manual setup
+
 1. Install Ollama: [https://ollama.com/download](https://ollama.com/download)

 2. Pull a local model if you want local inference:
@@ -28,7 +64,7 @@ ollama pull gpt-oss:20b
 ollama pull llama3.3
 ```

-3. If you want Ollama Cloud models too, sign in:
+3. If you want cloud models too, sign in:

 ```bash
 ollama signin
@@ -41,7 +77,7 @@ openclaw onboard
 ```

 - `Local`: local models only
-- `Cloud + Local`: local models plus Ollama Cloud models
+- `Cloud + Local`: local models plus cloud models
 - Cloud models such as `kimi-k2.5:cloud`, `minimax-m2.5:cloud`, and `glm-5:cloud` do **not** require a local `ollama pull`

 OpenClaw currently suggests:
@@ -191,6 +227,14 @@ Once configured, all your Ollama models are available:
 }
 ```

+## Cloud models
+
+Cloud models let you run cloud-hosted models (for example `kimi-k2.5:cloud`, `minimax-m2.5:cloud`, `glm-5:cloud`) alongside your local models.
+
+To use cloud models, select **Cloud + Local** mode during onboarding. The wizard checks whether you are signed in and opens a browser sign-in flow when needed. If authentication cannot be verified, the wizard falls back to local model defaults.
+
+You can also sign in directly at [ollama.com/signin](https://ollama.com/signin).
+
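The `:cloud` tag convention used by these model names can be checked mechanically. A minimal shell sketch (model names taken from the examples on this page):

```shell
# Classify model names by Ollama's ":cloud" tag suffix:
# cloud-tagged models run hosted and need no local pull.
for model in "kimi-k2.5:cloud" "glm-5:cloud" "llama3.3" "gpt-oss:20b"; do
  case "$model" in
    *:cloud) echo "$model -> cloud (no 'ollama pull' needed)" ;;
    *)       echo "$model -> local (fetch with: ollama pull $model)" ;;
  esac
done
```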
 ## Advanced

 ### Reasoning models