Releases: BerriAI/litellm

v1.70.1.dev8

20 May 22:12

What's Changed

New Contributors

Full Changelog: v1.70.1-stable...v1.70.1.dev8

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.1.dev8
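
Once the container is up, the proxy listens on port 4000 and serves an OpenAI-compatible API, including the /chat/completions route exercised in the load tests below. A minimal smoke-test request might look like this sketch; the key (sk-1234) and model name (gpt-3.5-turbo) are placeholders and assume a key and model have already been configured on the proxy:

# send one chat completion through the proxy (placeholder key and model)
curl http://localhost:4000/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer sk-1234" \
-d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Hello, what LLM are you?"}]}'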

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 490.0 | 562.6894453097084 | 5.610163428556303 | 0.0033413719050365115 | 1679 | 1 | 195.35745899997892 | 1568.1852209999647 |
| Aggregated | Failed ❌ | 490.0 | 562.6894453097084 | 5.610163428556303 | 0.0033413719050365115 | 1679 | 1 | 195.35745899997892 | 1568.1852209999647 |

v1.70.1.dev6

20 May 19:53

Full Changelog: v1.70.1.dev4...v1.70.1.dev6

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.1.dev6

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 520.0 | 614.117503553192 | 5.499725111392901 | 0.003343297940056475 | 1645 | 1 | 470.59824999996636 | 40452.63952500005 |
| Aggregated | Failed ❌ | 520.0 | 614.117503553192 | 5.499725111392901 | 0.003343297940056475 | 1645 | 1 | 470.59824999996636 | 40452.63952500005 |

v1.70.1.dev4

20 May 19:39

Full Changelog: v1.70.1.dev2...v1.70.1.dev4

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.1.dev4

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 500.0 | 594.1888488711803 | 5.578781212015596 | 0.006685178204931811 | 1669 | 2 | 190.37631700001612 | 40183.55002499999 |
| Aggregated | Failed ❌ | 500.0 | 594.1888488711803 | 5.578781212015596 | 0.006685178204931811 | 1669 | 2 | 190.37631700001612 | 40183.55002499999 |

v1.70.1.dev2

20 May 17:40

Full Changelog: v1.67.0-stable.patch2...v1.70.1.dev2

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.1.dev2

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 530.0 | 603.0679202423511 | 5.573003547206886 | 0.0 | 1667 | 0 | 488.8812669999538 | 2023.6071630000083 |
| Aggregated | Failed ❌ | 530.0 | 603.0679202423511 | 5.573003547206886 | 0.0 | 1667 | 0 | 488.8812669999538 | 2023.6071630000083 |

v1.70.1.dev11

20 May 22:22

Full Changelog: v1.70.1.dev8...v1.70.1.dev11

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.1.dev11

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 510.0 | 588.5182217067622 | 5.584563874858318 | 0.0 | 1671 | 0 | 469.68324099998426 | 1859.8325959999897 |
| Aggregated | Failed ❌ | 510.0 | 588.5182217067622 | 5.584563874858318 | 0.0 | 1671 | 0 | 469.68324099998426 | 1859.8325959999897 |

v1.67.0-stable.patch2

20 May 05:50
Commit: af73a8e

What's Changed

New Contributors

Full Changelog: v1.70.1-stable...v1.67.0-stable.patch2

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.67.0-stable.patch2

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 250.0 | 261.5294727693982 | 6.120234747528359 | 0.0 | 1830 | 0 | 217.36453699998037 | 1204.8032490000082 |
| Aggregated | Passed ✅ | 250.0 | 261.5294727693982 | 6.120234747528359 | 0.0 | 1830 | 0 | 217.36453699998037 | 1204.8032490000082 |

v1.70.1-stable

17 May 16:12

What's Changed

New Contributors

v1.70.0-nightly

17 May 05:39

What's Changed

New Contributors

Full Changelog: v1.69.3-nightly...v1.70.0-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.0-nightly
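
If you prefer to define models in a config file rather than in the database, the same image can also be started with a mounted config. This is a sketch based on the proxy's general config workflow rather than anything specific to this release; litellm_config.yaml, the model entry inside it, and the OPENAI_API_KEY value are placeholders:

# litellm_config.yaml (placeholder example):
#   model_list:
#     - model_name: gpt-3.5-turbo
#       litellm_params:
#         model: openai/gpt-3.5-turbo
#         api_key: os.environ/OPENAI_API_KEY
docker run \
-v $(pwd)/litellm_config.yaml:/app/config.yaml \
-e OPENAI_API_KEY=sk-placeholder \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.70.0-nightly \
--config /app/config.yaml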

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 190.0 | 206.64782881083894 | 6.289821899591554 | 0.0 | 1882 | 0 | 171.67403099995227 | 1154.6766310000294 |
| Aggregated | Passed ✅ | 190.0 | 206.64782881083894 | 6.289821899591554 | 0.0 | 1882 | 0 | 171.67403099995227 | 1154.6766310000294 |

v1.69.3-nightly

15 May 15:01

What's Changed

New Contributors

Full Changelog: v1.69.2-nightly...v1.69.3-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.69.3-nightly

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 250.0 | 271.38705055312096 | 6.10229777560623 | 0.0 | 1826 | 0 | 225.92644499991366 | 2090.630247999968 |
| Aggregated | Passed ✅ | 250.0 | 271.38705055312096 | 6.10229777560623 | 0.0 | 1826 | 0 | 225.92644499991366 | 2090.630247999968 |

v1.69.0.patch1-stable

15 May 04:36

Full Changelog: v1.69.0-stable...v1.69.0.patch1-stable

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.69.0.patch1-stable
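
Note that STORE_MODEL_IN_DB=True persists models added through the proxy's UI/API in its database, so a real deployment also needs a database connection and an admin key. A fuller invocation might look like the sketch below; DATABASE_URL and LITELLM_MASTER_KEY are commonly used proxy settings, but the values shown are placeholders and are not part of this release note:

# same image as above, with placeholder database URL and master key
docker run \
-e STORE_MODEL_IN_DB=True \
-e DATABASE_URL="postgresql://user:password@db-host:5432/litellm" \
-e LITELLM_MASTER_KEY="sk-1234" \
-p 4000:4000 \
ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.69.0.patch1-stable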

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 210.0 | 225.79693725160666 | 6.241896325571918 | 0.0 | 1868 | 0 | 189.1762820000622 | 1291.5362580000078 |
| Aggregated | Passed ✅ | 210.0 | 225.79693725160666 | 6.241896325571918 | 0.0 | 1868 | 0 | 189.1762820000622 | 1291.5362580000078 |