
[Bug]: codex-mini error #14663


Description

@superpoussin22

What happened?

When using the codex-mini-latest model with Kilo Code, I get this error. The same happens in the LiteLLM UI if completion is selected.

Relevant log output

LiteLLM streaming error: 400 litellm.BadRequestError: OpenAIException - {
  "error": {
    "message": "Unknown parameter: 'stream_options.include_usage'.",
    "type": "invalid_request_error",
    "param": "stream_options.include_usage",
    "code": "unknown_parameter"
  }
}. Received Model Group=openai/codex-mini
Available Model Group Fallbacks=None
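
For reference (not from the original report): a minimal reproduction sketch using LiteLLM's Python SDK. The model name mirrors the report; the workaround is an assumption, since `litellm.drop_params = True` is LiteLLM's documented way to strip provider-unsupported parameters, and it is not confirmed here that it removes the nested `stream_options.include_usage` that codex-mini rejects.

import litellm

# Assumption: dropping unsupported params may avoid the upstream 400;
# verify against the LiteLLM docs that it covers stream_options.include_usage.
litellm.drop_params = True

response = litellm.completion(
    model="openai/codex-mini-latest",
    messages=[{"role": "user", "content": "hello"}],
    stream=True,
    # No stream_options here: sending stream_options={"include_usage": True}
    # is what triggers the "Unknown parameter" error in the log above.
)

for chunk in response:
    print(chunk)

If the proxy injects `include_usage` itself, a per-request `drop_params=True` on the call may be needed instead of the global flag; that, too, is an assumption to verify.
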

Are you an ML Ops Team?

No

What LiteLLM version are you on?

V1.77.2rc1

Twitter / LinkedIn details

No response
