Description
What happened?
When using the codex-mini-latest model with Kilo Code, I got the error below. The same error appears in the LiteLLM UI if completion is selected.
Relevant log output
LiteLLM streaming error: 400 litellm.BadRequestError: OpenAIException - {
"error": {
"message": "Unknown parameter: 'stream_options.include_usage'.",
"type": "invalid_request_error",
"param": "stream_options.include_usage",
"code": "unknown_parameter"
}
}. Received Model Group=openai/codex-mini
Available Model Group Fallbacks=None
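For context, the error message suggests the upstream codex-mini endpoint rejects the `stream_options.include_usage` parameter that is attached to streaming requests. A minimal sketch of a client-side workaround, stripping that parameter before the request is sent (the payload shape here is an assumption reconstructed from the error message, not LiteLLM's actual internals):

```python
def strip_stream_options(payload: dict) -> dict:
    """Remove stream_options.include_usage, which the codex-mini
    endpoint rejects with an unknown_parameter error."""
    cleaned = dict(payload)
    stream_options = dict(cleaned.get("stream_options", {}))
    stream_options.pop("include_usage", None)
    if stream_options:
        cleaned["stream_options"] = stream_options
    else:
        # Drop the key entirely if nothing else remains in it.
        cleaned.pop("stream_options", None)
    return cleaned

# Hypothetical request mirroring what the error message implies was sent.
request = {
    "model": "openai/codex-mini",
    "stream": True,
    "stream_options": {"include_usage": True},  # the rejected parameter
}
print(strip_stream_options(request))
# -> {'model': 'openai/codex-mini', 'stream': True}
```

LiteLLM's `drop_params` setting (globally or per-model via `litellm_params`) may achieve a similar effect by dropping parameters the provider does not support; see the LiteLLM docs for the exact configuration.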
Are you a ML Ops Team?
No
What LiteLLM version are you on?
V1.77.2rc1