Issues: BerriAI/litellm
Issues list

[Bug]: strict response format example is incorrect (label: bug)
#10320 opened Apr 25, 2025 by Tomas2D

[Bug]: model_group_alias model does not respect fallbacks (label: bug)
#10317 opened Apr 25, 2025 by arkadiy-telegin

[Bug]: Image model custom cost computation failing (label: bug)
#10316 opened Apr 25, 2025 by winningcode

[Bug]: incorrect prompt_tokens (label: bug)
#10311 opened Apr 25, 2025 by reymondzzzz

ui_custom_path issue
#10310 opened Apr 25, 2025 by samuel123-sys711

[Feature]: support service_tier for o3 and o4-mini (label: enhancement)
#10307 opened Apr 25, 2025 by alexzeitgeist

[Feature]: support streaming STT from Fireworks (label: enhancement)
#10304 opened Apr 25, 2025 by MyButtermilk

[Bug]: OpenRouter Grok model exception in proxy method (label: bug)
#10303 opened Apr 25, 2025 by yekangming

[Bug]: custom_llm_provider april 2025 (labels: bug, mlops user request)
#10287 opened Apr 24, 2025 by abarahonar