Releases: BerriAI/litellm

v1.71.1-stable

25 May 21:17

What's Changed

v1.71.1-nightly

25 May 15:14

What's Changed

  • Logfire - fix(opentelemetry.py): Fix otel proxy server initialization + Return abbreviated key in key not found error (easier clientside debugging) + Ignore invalid deployments on router load by @krrishdholakia in #11091
  • feat(handle_jwt.py): map user to team when added via jwt auth by @krrishdholakia in #11108
  • fix(ui_sso.py): maintain backwards compatibility for older user id formats + fix existing user email w/ trailing whitespace check + ensure default_internal_user_settings runs on all user new calls by @krrishdholakia in #11106
  • fix(route_llm_request.py): map team model from list in route llm request by @krrishdholakia in #11111
  • Remove + Check for unsafe enterprise/ folder imports by @krrishdholakia in #11107
  • Fix: Add Claude Sonnet 4 and Opus 4 support for reasoning_effort parameter by @keykbd in #11114 (see the example after this list)
  • fix(session): correctly place litellm_session_id at root level instead of metadata by @dalssoft in #11088
  • fix(model_management_endpoints): clear cache and reload models after update by @jtong99 in #10853
  • [Feat] Add /image/edits on LiteLLM by @ishaan-jaff in #11123
  • Correctly delete team model alias when team only model is deleted (#… by @krrishdholakia in #11121
  • fix: detect and return status codes in streaming responses by @aholmberg in #10962
  • Fix passing standard optional params by @krrishdholakia in #11124
  • UI QA fix: team viewer should not see create team by @ishaan-jaff in #11127
  • [Chore]: feature flag aiohttp transport - users should opt into using aiohttp transport by @ishaan-jaff in #11132
  • v1.71.1-stable - notes by @ishaan-jaff in #11133
  • Litellm revert redis changes by @krrishdholakia in #11135
  • Litellm fix multi instance checks on teams by @krrishdholakia in #11137
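
The reasoning_effort item above (#11114) extends an existing litellm completion parameter to the Claude Sonnet 4 and Opus 4 models. A minimal sketch of passing that parameter through the Python SDK, assuming ANTHROPIC_API_KEY is set in the environment and treating the model ID and effort level as illustrative placeholders:

# Sketch only: reasoning_effort with a Claude 4 model via litellm.
# Requires ANTHROPIC_API_KEY in the environment; the model ID and
# reasoning_effort value below are placeholders, not prescriptions.
from litellm import completion

response = completion(
    model="anthropic/claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "Summarize this release in one sentence."}],
    reasoning_effort="low",  # parameter now accepted for Claude Sonnet 4 / Opus 4
)
print(response.choices[0].message.content)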

New Contributors

Full Changelog: v1.71.0-nightly...v1.71.1-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.71.1-nightly
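
Once the container is running, the proxy exposes OpenAI-compatible routes on port 4000 (the same /chat/completions route exercised in the load test below). A minimal smoke-test sketch using the OpenAI Python client, assuming a model and virtual key have already been configured on the proxy; the model name and key here are placeholders:

# Point the standard OpenAI client at the local LiteLLM proxy.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # the LiteLLM proxy, not api.openai.com
    api_key="sk-1234",                 # placeholder virtual key
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder: use any model name configured on the proxy
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)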

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 250.0 | 271.62186726419185 | 6.123952252359233 | 0.0 | 1832 | 0 | 215.75241199997208 | 1968.6522410000293 |
| Aggregated | Passed ✅ | 250.0 | 271.62186726419185 | 6.123952252359233 | 0.0 | 1832 | 0 | 215.75241199997208 | 1968.6522410000293 |

v1.71.0-nightly

24 May 15:11

What's Changed

New Contributors

Full Changelog: v1.70.4-nightly...v1.71.0-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.71.0-nightly

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 310.0 | 321.9030490306742 | 5.995437687618243 | 4.166377779571852 | 1793 | 1246 | 259.8663090000173 | 771.521746000019 |
| Aggregated | Failed ❌ | 310.0 | 321.9030490306742 | 5.995437687618243 | 4.166377779571852 | 1793 | 1246 | 259.8663090000173 | 771.521746000019 |

v1.70.2.dev6

24 May 04:51

Full Changelog: v1.70.2-nightly...v1.70.2.dev6

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.2.dev6

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 470.0 | 546.8056516133963 | 5.687025964404996 | 0.0 | 1702 | 0 | 432.45771799996646 | 2108.0635040000006 |
| Aggregated | Failed ❌ | 470.0 | 546.8056516133963 | 5.687025964404996 | 0.0 | 1702 | 0 | 432.45771799996646 | 2108.0635040000006 |

v1.70.4-nightly

23 May 00:56

Full Changelog: v1.70.2.dev5...v1.70.4-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.4-nightly

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 530.0 | 592.8290290780234 | 5.619046633443039 | 0.0 | 1679 | 0 | 480.5262289999632 | 1595.1236809999614 |
| Aggregated | Failed ❌ | 530.0 | 592.8290290780234 | 5.619046633443039 | 0.0 | 1679 | 0 | 480.5262289999632 | 1595.1236809999614 |

v1.70.2.dev5

22 May 21:26

What's Changed

New Contributors

Full Changelog: v1.70.2-nightly...v1.70.2.dev5

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.2.dev5

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 490.0 | 556.7359308349169 | 5.626256137716844 | 0.0 | 1684 | 0 | 437.79858300001706 | 2137.070654000013 |
| Aggregated | Failed ❌ | 490.0 | 556.7359308349169 | 5.626256137716844 | 0.0 | 1684 | 0 | 437.79858300001706 | 2137.070654000013 |

v1.70.2-nightly

21 May 01:03

Full Changelog: v1.70.1.dev2...v1.70.2-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.2-nightly

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 480.0 | 551.6473700786378 | 5.693462012007638 | 0.0 | 1704 | 0 | 435.9962309999901 | 1522.4978100000044 |
| Aggregated | Failed ❌ | 480.0 | 551.6473700786378 | 5.693462012007638 | 0.0 | 1704 | 0 | 435.9962309999901 | 1522.4978100000044 |

What's Changed

New Contributors

Full Changelog: v1.70.1-stable...v1.70.2-nightly

v1.70.1.dev8

20 May 22:12

What's Changed

New Contributors

Full Changelog: v1.70.1-stable...v1.70.1.dev8

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.1.dev8

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 490.0 | 562.6894453097084 | 5.610163428556303 | 0.0033413719050365115 | 1679 | 1 | 195.35745899997892 | 1568.1852209999647 |
| Aggregated | Failed ❌ | 490.0 | 562.6894453097084 | 5.610163428556303 | 0.0033413719050365115 | 1679 | 1 | 195.35745899997892 | 1568.1852209999647 |

v1.70.1.dev6

20 May 19:53

Full Changelog: v1.70.1.dev4...v1.70.1.dev6

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.1.dev6

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 520.0 | 614.117503553192 | 5.499725111392901 | 0.003343297940056475 | 1645 | 1 | 470.59824999996636 | 40452.63952500005 |
| Aggregated | Failed ❌ | 520.0 | 614.117503553192 | 5.499725111392901 | 0.003343297940056475 | 1645 | 1 | 470.59824999996636 | 40452.63952500005 |

v1.70.1.dev4

20 May 19:39

Full Changelog: v1.70.1.dev2...v1.70.1.dev4

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.1.dev4

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 500.0 | 594.1888488711803 | 5.578781212015596 | 0.006685178204931811 | 1669 | 2 | 190.37631700001612 | 40183.55002499999 |
| Aggregated | Failed ❌ | 500.0 | 594.1888488711803 | 5.578781212015596 | 0.006685178204931811 | 1669 | 2 | 190.37631700001612 | 40183.55002499999 |