
[BUG] tokens too large for default #346

@enoch3712

Description


"Make sure that the model you're using supports vision features: litellm.BadRequestError: OpenAIException - max_tokens is too large: 32000. This model supports at most 16384 completion tokens, whereas you provided 32000."

This happens with gpt-4o.
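A possible workaround while this is unfixed: clamp the requested `max_tokens` to the model's completion-token cap before making the call. The 16384 limit for gpt-4o is taken from the error message above; the mapping and function below are an illustrative sketch, not part of the library.

```python
# Hedged sketch: cap the requested completion tokens before calling litellm.
# The 16384 figure for gpt-4o comes from the error message in this issue
# ("This model supports at most 16384 completion tokens"); limits for other
# models are assumptions and should be checked against provider docs.
MODEL_COMPLETION_LIMITS = {
    "gpt-4o": 16_384,
}

def clamp_max_tokens(model: str, requested: int, default_limit: int = 16_384) -> int:
    """Return a max_tokens value that does not exceed the model's cap."""
    limit = MODEL_COMPLETION_LIMITS.get(model, default_limit)
    return min(requested, limit)

print(clamp_max_tokens("gpt-4o", 32_000))  # 16384 instead of the failing 32000
```

Passing the clamped value as `max_tokens` in the completion call should avoid the `BadRequestError` shown above.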
