Releases: BerriAI/litellm

v1.67.3.dev1

24 Apr 06:01

What's Changed

Full Changelog: v1.67.2-nightly...v1.67.3.dev1

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.67.3.dev1
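
Once the container is up, the proxy listens on port 4000 and speaks the OpenAI wire format, so a quick smoke test against the /chat/completions route used in the load tests below might look like the sketch that follows; the model name and the sk-1234 key are placeholders for whatever you have configured on your deployment:

# placeholder key and model: replace with your own virtual key and configured model
curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{"model": "gpt-4o", "messages": [{"role": "user", "content": "ping"}]}'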

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 210.0 | 235.18092614107948 | 6.181088327781123 | 0.0 | 1850 | 0 | 192.45027600004505 | 4892.269687999942 |
| Aggregated | Passed ✅ | 210.0 | 235.18092614107948 | 6.181088327781123 | 0.0 | 1850 | 0 | 192.45027600004505 | 4892.269687999942 |

v1.67.2-nightly

24 Apr 05:56

What's Changed

New Contributors

Full Changelog: v1.67.1-nightly...v1.67.2-nightly

v1.67.1-nightly

22 Apr 22:55
a7db0df

What's Changed

Full Changelog: v1.67.0-nightly...v1.67.1-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.67.1-nightly

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 220.0 | 263.64700999835935 | 6.1132795166960605 | 0.0 | 1829 | 0 | 199.11094299999377 | 4358.182531000011 |
| Aggregated | Passed ✅ | 220.0 | 263.64700999835935 | 6.1132795166960605 | 0.0 | 1829 | 0 | 199.11094299999377 | 4358.182531000011 |

v1.67.0-stable

19 Apr 19:35
03b5399

What's Changed

New Contributors

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.67.0-stable

Full Changelog: v1.66.0-stable...v1.67.0-stable

v1.67.0-nightly

19 Apr 23:32

What's Changed

  • [Feat] Expose Responses API on LiteLLM UI Test Key Page by @ishaan-jaff in #10166
  • [Bug Fix] Spend Tracking Bug Fix, don't modify in memory default litellm params by @ishaan-jaff in #10167
  • Bug Fix - Responses API, Loosen restrictions on allowed environments for computer use tool by @ishaan-jaff in #10168
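
For context on the Responses API items above: the proxy exposes an OpenAI-style /v1/responses route alongside /chat/completions, so a minimal request against a running deployment could look like the sketch below (the host, the sk-1234 key, and the model name are placeholders; adjust them to your setup):

# hypothetical smoke test for the Responses API route; key and model are placeholders
curl http://localhost:4000/v1/responses \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{"model": "gpt-4o", "input": "ping"}'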

Full Changelog: v1.67.0-stable...v1.67.0-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.67.0-nightly

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 230.0 | 262.85419851041036 | 6.266552109647687 | 0.0 | 1873 | 0 | 202.24337799993464 | 5393.98836700002 |
| Aggregated | Passed ✅ | 230.0 | 262.85419851041036 | 6.266552109647687 | 0.0 | 1873 | 0 | 202.24337799993464 | 5393.98836700002 |

v1.66.3.dev5

19 Apr 03:41
3d5022b

What's Changed

  • [Feat] Unified Responses API - Add Azure Responses API support by @ishaan-jaff in #10116
  • UI: Make columns resizable/hideable in Models table by @msabramo in #10119
  • Remove unnecessary package*.json files by @msabramo in #10075
  • Add Gemini Flash 2.5 Preview Model Price and Context Window by @drmingler in #10125
  • test: update tests to new deployment model by @krrishdholakia in #10142
  • [Feat] Support for all litellm providers on Responses API (works with Codex) - Anthropic, Bedrock API, VertexAI, Ollama by @ishaan-jaff in #10132
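
Since the cross-provider Responses API support above is what lets OpenAI-compatible clients such as Codex talk to Anthropic, Bedrock, Vertex AI, or Ollama models through the proxy, one way to try it is to point the client's base URL and API key at LiteLLM. A rough sketch, assuming a proxy on localhost:4000 and a virtual key you have created:

# hypothetical setup: route an OpenAI-compatible client (e.g. Codex) through the proxy
export OPENAI_BASE_URL="http://localhost:4000"   # LiteLLM proxy endpoint (placeholder)
export OPENAI_API_KEY="sk-1234"                  # placeholder LiteLLM virtual key
# the client then calls the proxy, which translates requests for a model configured
# there (e.g. an Anthropic or Bedrock model) into that provider's native API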

New Contributors

Full Changelog: v1.66.2.dev1...v1.66.3.dev5

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.66.3.dev5

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 230.0 | 241.46378394371686 | 6.1149592690003 | 0.0 | 1830 | 0 | 197.6759699999775 | 1416.5823339999974 |
| Aggregated | Passed ✅ | 230.0 | 241.46378394371686 | 6.1149592690003 | 0.0 | 1830 | 0 | 197.6759699999775 | 1416.5823339999974 |

v1.66.3.dev1

18 Apr 02:27

What's Changed

  • [Feat] Unified Responses API - Add Azure Responses API support by @ishaan-jaff in #10116
  • UI: Make columns resizable/hideable in Models table by @msabramo in #10119

Full Changelog: v1.66.2.dev1...v1.66.3.dev1

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.66.3.dev1

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 180.0 | 210.74489628810068 | 6.401988824678471 | 0.003341330284278951 | 1916 | 1 | 38.52582800004711 | 5506.760536000002 |
| Aggregated | Passed ✅ | 180.0 | 210.74489628810068 | 6.401988824678471 | 0.003341330284278951 | 1916 | 1 | 38.52582800004711 | 5506.760536000002 |

v1.66.3-nightly

17 Apr 20:58

What's Changed

Full Changelog: v1.66.2-nightly...v1.66.3-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.66.3-nightly

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 250.0 | 302.3290337319068 | 6.097097387542003 | 0.04679789661490572 | 1824 | 14 | 218.4401190000358 | 5459.562037000012 |
| Aggregated | Failed ❌ | 250.0 | 302.3290337319068 | 6.097097387542003 | 0.04679789661490572 | 1824 | 14 | 218.4401190000358 | 5459.562037000012 |

v1.66.2.dev1

17 Apr 20:36

What's Changed

Full Changelog: v1.66.2-nightly...v1.66.2.dev1

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.66.2.dev1

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 200.0 | 242.7078390639904 | 6.1689738182726535 | 0.0 | 1844 | 0 | 181.44264199997906 | 6553.659710999966 |
| Aggregated | Passed ✅ | 200.0 | 242.7078390639904 | 6.1689738182726535 | 0.0 | 1844 | 0 | 181.44264199997906 | 6553.659710999966 |

v1.66.2-nightly

17 Apr 05:43
47e811d

What's Changed

New Contributors

Full Changelog: v1.66.1-nightly...v1.66.2-nightly

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.66.2-nightly

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 190.0 | 244.45508035967268 | 6.136194497326665 | 0.0 | 1835 | 0 | 169.77143499997283 | 8723.871383000016 |
| Aggregated | Passed ✅ | 190.0 | 244.45508035967268 | 6.136194497326665 | 0.0 | 1835 | 0 | 169.77143499997283 | 8723.871383000016 |