Mirror of https://github.com/openclaw/openclaw.git (synced 2026-05-10 11:24:58 +00:00)
Some OpenAI-format providers (via pi-ai) pre-subtract cached_tokens from prompt_tokens upstream. When cached_tokens exceeds prompt_tokens due to provider inconsistencies, the subtraction produces a negative input value that flows through to the TUI status bar and the /usage dashboard. Clamp rawInput to 0 in normalizeUsage() so downstream consumers never see nonsensical negative token counts.

Closes #30765

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
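A minimal sketch of the clamp this commit describes. The `Usage` and `NormalizedUsage` shapes and field names here are assumptions for illustration; the actual normalizeUsage() in the repository will differ in detail.

```typescript
// Assumed shape of provider-reported usage (hypothetical for this sketch).
interface Usage {
  prompt_tokens: number;
  cached_tokens?: number;
}

// Assumed normalized shape consumed by the TUI status bar and /usage dashboard.
interface NormalizedUsage {
  input: number;
  cached: number;
}

function normalizeUsage(usage: Usage): NormalizedUsage {
  const cached = usage.cached_tokens ?? 0;
  // Some providers have already subtracted cached_tokens from prompt_tokens
  // upstream, so this difference can go negative when cached > prompt.
  // Clamp to 0 so downstream consumers never see a negative token count.
  const rawInput = Math.max(0, usage.prompt_tokens - cached);
  return { input: rawInput, cached };
}
```

With the clamp, an inconsistent report such as `{ prompt_tokens: 100, cached_tokens: 150 }` normalizes to an input of 0 instead of -50.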
4.8 KiB