mirror of
https://github.com/openclaw/openclaw.git
synced 2026-04-18 18:37:26 +00:00
feat(openai): add websocket warm-up with configurable toggle
@@ -45,6 +45,7 @@ OpenClaw ships with the pi‑ai catalog. These providers require **no**
- CLI: `openclaw onboard --auth-choice openai-api-key`
- Default transport is `auto` (WebSocket-first, SSE fallback)
- Override per model via `agents.defaults.models["openai/<model>"].params.transport` (`"sse"`, `"websocket"`, or `"auto"`)
- OpenAI Responses WebSocket warm-up defaults to enabled via `params.openaiWsWarmup` (`true`/`false`)

```json5
{
@@ -68,6 +68,9 @@ You can set `agents.defaults.models.<provider/model>.params.transport`:
- `"websocket"`: force WebSocket
- `"auto"`: try WebSocket, then fall back to SSE

For `openai/*` (Responses API), OpenClaw also enables WebSocket warm-up by default (`openaiWsWarmup: true`) when WebSocket transport is used.

```json5
{
  agents: {
@@ -85,6 +88,47 @@ You can set `agents.defaults.models.<provider/model>.params.transport`:
}
```
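
The `auto` policy above (WebSocket-first with SSE fallback) can be sketched roughly as follows. This is an illustrative sketch only; `connectAuto` and the `Transport` type are hypothetical names, not OpenClaw's actual internals:

```typescript
// Sketch of the "auto" transport policy: try WebSocket first,
// fall back to SSE if the WebSocket attempt fails.
// All names here are illustrative, not OpenClaw's real API.
type Transport = "websocket" | "sse";

async function connectAuto(
  tryWebSocket: () => Promise<Transport>,
  trySse: () => Promise<Transport>,
): Promise<Transport> {
  try {
    // Preferred path: WebSocket handshake.
    return await tryWebSocket();
  } catch {
    // Handshake failed (proxy, firewall, server refusal): use SSE.
    return await trySse();
  }
}
```

With `transport: "websocket"` the fallback branch would instead surface the error, and with `transport: "sse"` the WebSocket attempt is skipped entirely.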

### OpenAI WebSocket warm-up

OpenAI docs describe warm-up as optional. OpenClaw enables it by default for `openai/*` to reduce first-turn latency when using WebSocket transport.

### Disable warm-up

```json5
{
  agents: {
    defaults: {
      models: {
        "openai/gpt-5": {
          params: {
            openaiWsWarmup: false,
          },
        },
      },
    },
  },
}
```

### Enable warm-up explicitly

```json5
{
  agents: {
    defaults: {
      models: {
        "openai/gpt-5": {
          params: {
            openaiWsWarmup: true,
          },
        },
      },
    },
  },
}
```

### OpenAI Responses server-side compaction

For direct OpenAI Responses models (`openai/*` using `api: "openai-responses"` with