mirror of
https://github.com/openclaw/openclaw.git
synced 2026-04-18 04:47:28 +00:00
fix(config): add openai-codex-responses to ModelApiSchema
The config schema validates provider api fields against ModelApiSchema, but openai-codex-responses was missing from the allowed values. This forces users to set api: "openai-responses" for the openai-codex provider, which routes requests to api.openai.com/v1/responses instead of chatgpt.com/backend-api/codex/responses, causing HTTP 401 errors because Codex OAuth tokens lack the api.responses.write scope for the standard OpenAI Responses endpoint.

The runtime already supports openai-codex-responses throughout: model registry, stream dispatch (streamOpenAICodexResponses), and provider detection (the OPENAI_MODEL_APIS set). Only the config schema was missing the literal.
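The failure mode can be sketched as an allowlist check. This is a simplified stand-in for the zod union, not openclaw's actual code: the MODEL_APIS set and validateApi function are illustrative, but they capture why a missing union member makes an otherwise-valid config fail validation.

```typescript
// Simplified stand-in for ModelApiSchema: an exact-match allowlist of
// API identifiers. Any value outside the set is rejected, which is why
// the missing literal forced users onto the wrong endpoint.
const MODEL_APIS: ReadonlySet<string> = new Set([
  "openai-completions",
  "openai-responses",
  "openai-codex-responses", // the literal this commit adds
  "anthropic-messages",
  "google-generative-ai",
  "github-copilot",
]);

// Validate a provider's api field against the allowlist, mirroring how
// a z.union of z.literal values accepts only the listed strings.
function validateApi(api: string): string {
  if (!MODEL_APIS.has(api)) {
    throw new Error(`invalid api "${api}"`);
  }
  return api;
}

console.log(validateApi("openai-codex-responses")); // accepted after the fix
```

Without the added literal, validateApi("openai-codex-responses") would throw, which is the schema-level rejection this commit removes.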
committed by Peter Steinberger
parent d92fc85555
commit 861b90f79c
@@ -182,6 +182,7 @@ export const SecretsConfigSchema = z
 export const ModelApiSchema = z.union([
   z.literal("openai-completions"),
   z.literal("openai-responses"),
+  z.literal("openai-codex-responses"),
   z.literal("anthropic-messages"),
   z.literal("google-generative-ai"),
   z.literal("github-copilot"),