Commit 2394cd4

stable release note fixes (#9954)
* docs fix
* docs metrics
* docs fix release notes
* docs 1.66.0-stable
1 parent c86e678 commit 2394cd4

2 files changed: +35 -26 lines changed

docs/my-website/docs/proxy/prometheus.md

Lines changed: 8 additions & 1 deletion
@@ -95,7 +95,14 @@ Use this for for tracking per [user, key, team, etc.](virtual_keys)
 
 ### Initialize Budget Metrics on Startup
 
-If you want to initialize the key/team budget metrics on startup, you can set the `prometheus_initialize_budget_metrics` to `true` in the `config.yaml`
+If you want litellm to emit the budget metrics for all keys, teams irrespective of whether they are getting requests or not, set `prometheus_initialize_budget_metrics` to `true` in the `config.yaml`
+
+**How this works:**
+
+- If the `prometheus_initialize_budget_metrics` is set to `true`
+- Every 5 minutes litellm runs a cron job to read all keys, teams from the database
+- It then emits the budget metrics for each key, team
+- This is used to populate the budget metrics on the `/metrics` endpoint
 
 ```yaml
 litellm_settings:
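
For reference, a minimal `config.yaml` sketch of the setting this doc change describes. Its placement under `litellm_settings` follows the context lines above; the `callbacks` entry and the `model_list` block are assumptions added only to make the snippet self-contained.

```yaml
litellm_settings:
  # assumed standard setup: enable the prometheus logging callback
  callbacks: ["prometheus"]
  # emit budget metrics for every key/team on a recurring schedule,
  # even for keys/teams that receive no requests
  prometheus_initialize_budget_metrics: true

# hypothetical model entry, included only so the config stands alone
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
```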

docs/my-website/release_notes/v1.66.0-stable/index.md

Lines changed: 27 additions & 25 deletions
@@ -46,7 +46,7 @@ v1.66.0-stable is live now, here are the key highlights of this release
 ## Key Highlights
 - **Microsoft SSO Auto-sync**: Auto-sync groups and group members from Azure Entra ID to LiteLLM
 - **Unified File IDs**: Use the same file id across LLM API providers.
-- **Realtime API Cost Tracking**: Track cost of realtime api calls
+- **Realtime API Cost Tracking**: Track cost of realtime API calls
 - **xAI grok-3**: Added support for `xai/grok-3` models
 - **Security Fixes**: Fixed [CVE-2025-0330](https://www.cve.org/CVERecord?id=CVE-2025-0330) and [CVE-2024-6825](https://www.cve.org/CVERecord?id=CVE-2024-6825) vulnerabilities
 
@@ -62,10 +62,10 @@ Let's dive in.
 Auto-sync groups and members from Azure Entra ID to LiteLLM
 </p>
 
-This release adds support for auto-syncing groups and members on Microsoft Entra ID with LiteLLM. This means that litellm proxy administrators can spend less time managing teams and members and LiteLLM handles the following:
+This release adds support for auto-syncing groups and members on Microsoft Entra ID with LiteLLM. This means that LiteLLM proxy administrators can spend less time managing teams and members and LiteLLM handles the following:
 
-- Auto-create Teams that existing on Microsoft Entra ID
-- Sync team members on Microsoft Entra ID with LiteLLM Teams
+- Auto-create teams that exist on Microsoft Entra ID
+- Sync team members on Microsoft Entra ID with LiteLLM teams
 
 Get started with this [here](https://docs.litellm.ai/docs/tutorials/msft_sso)
 
@@ -76,42 +76,42 @@ Get started with this [here](https://docs.litellm.ai/docs/tutorials/msft_sso)
 ## New Models / Updated Models
 
 - xAI
-1. Added cost tracking for `xai/grok-3` models [PR](https://github.com/BerriAI/litellm/pull/9920)
-2. Added reasoning_effort support for `xai/grok-3-mini-beta` model family [PR](https://github.com/BerriAI/litellm/pull/9932)
+1. Added reasoning_effort support for `xai/grok-3-mini-beta` [Get Started](https://docs.litellm.ai/docs/providers/xai#reasoning-usage)
+2. Added cost tracking for `xai/grok-3` models [PR](https://github.com/BerriAI/litellm/pull/9920)
 
 - Hugging Face
-1. Hugging Face - Added inference providers support [Getting Started](https://docs.litellm.ai/docs/providers/huggingface#serverless-inference-providers)
+1. Added inference providers support [Get Started](https://docs.litellm.ai/docs/providers/huggingface#serverless-inference-providers)
 
 - Azure
-1. Azure - Added azure/gpt-4o-realtime-audio cost tracking [PR](https://github.com/BerriAI/litellm/pull/9893)
+1. Added azure/gpt-4o-realtime-audio cost tracking [PR](https://github.com/BerriAI/litellm/pull/9893)
 
 - VertexAI
-1. VertexAI - Added enterpriseWebSearch tool support [PR](https://github.com/BerriAI/litellm/pull/9856)
-2. VertexAI - Moved to only passing in accepted keys by vertex ai response schema [PR](https://github.com/BerriAI/litellm/pull/8992)
+1. Added enterpriseWebSearch tool support [Get Started](https://docs.litellm.ai/docs/providers/vertex#grounding---web-search)
+2. Moved to only passing keys accepted by the Vertex AI response schema [PR](https://github.com/BerriAI/litellm/pull/8992)
 
 - Google AI Studio
-1. Google AI Studio - Added cost tracking for `gemini-2.5-pro` [PR](https://github.com/BerriAI/litellm/pull/9837)
-2. Google AI Studio - Fixed pricing for 'gemini/gemini-2.5-pro-preview-03-25' [PR](https://github.com/BerriAI/litellm/pull/9896)
-3. Google AI Studio - Fixed handling file_data being passed in [PR](https://github.com/BerriAI/litellm/pull/9786)
+1. Added cost tracking for `gemini-2.5-pro` [PR](https://github.com/BerriAI/litellm/pull/9837)
+2. Fixed pricing for 'gemini/gemini-2.5-pro-preview-03-25' [PR](https://github.com/BerriAI/litellm/pull/9896)
+3. Fixed handling file_data being passed in [PR](https://github.com/BerriAI/litellm/pull/9786)
 
 - Azure
-1. Azure - Updated Azure Phi-4 pricing [PR](https://github.com/BerriAI/litellm/pull/9862)
-2. Azure - Added azure/gpt-4o-realtime-audio cost tracking [PR](https://github.com/BerriAI/litellm/pull/9893)
+1. Updated Azure Phi-4 pricing [PR](https://github.com/BerriAI/litellm/pull/9862)
+2. Added azure/gpt-4o-realtime-audio cost tracking [PR](https://github.com/BerriAI/litellm/pull/9893)
 
 - Databricks
-1. Databricks - Removed reasoning_effort from parameters [PR](https://github.com/BerriAI/litellm/pull/9811)
+1. Removed reasoning_effort from parameters [PR](https://github.com/BerriAI/litellm/pull/9811)
 2. Fixed custom endpoint check for Databricks [PR](https://github.com/BerriAI/litellm/pull/9925)
 
 - General
-1. Function Calling - Handle pydantic base model in message tool calls, handle tools = [], and support fake streaming on tool calls for meta.llama3-3-70b-instruct-v1:0 [PR](https://github.com/BerriAI/litellm/pull/9774)
-2. LiteLLM Proxy - Allow passing `thinking` param to litellm proxy via client sdk [PR](https://github.com/BerriAI/litellm/pull/9386)
-3. Reasoning - Added litellm.supports_reasoning() util to track if an llm supports reasoning [PR](https://github.com/BerriAI/litellm/pull/9923)
+1. Added litellm.supports_reasoning() util to track if an llm supports reasoning [Get Started](https://docs.litellm.ai/docs/providers/anthropic#reasoning)
+2. Function Calling - Handle pydantic base model in message tool calls, handle tools = [], and support fake streaming on tool calls for meta.llama3-3-70b-instruct-v1:0 [PR](https://github.com/BerriAI/litellm/pull/9774)
+3. LiteLLM Proxy - Allow passing `thinking` param to litellm proxy via client sdk [PR](https://github.com/BerriAI/litellm/pull/9386)
 4. Fixed correctly translating 'thinking' param for litellm [PR](https://github.com/BerriAI/litellm/pull/9904)
 
 
 ## Spend Tracking Improvements
 - OpenAI, Azure
-1. Realtime API Cost tracking with token usage metrics in spend logs [PR](https://github.com/BerriAI/litellm/pull/9795)
+1. Realtime API Cost tracking with token usage metrics in spend logs [Get Started](https://docs.litellm.ai/docs/realtime)
 - Anthropic
 1. Fixed Claude Haiku cache read pricing per token [PR](https://github.com/BerriAI/litellm/pull/9834)
 2. Added cost tracking for Claude responses with base_model [PR](https://github.com/BerriAI/litellm/pull/9897)
@@ -134,32 +134,34 @@ Get started with this [here](https://docs.litellm.ai/docs/tutorials/msft_sso)
 View input, output, reasoning tokens, ttft metrics.
 </p>
 2. Tag / Policy Management:
-1. Added Tag/Policy Management [PR](https://github.com/BerriAI/litellm/pull/9813)
+1. Added Tag/Policy Management. Create routing rules based on request metadata. This allows you to enforce that requests with `tags="private"` only go to specific models. [Get Started](https://docs.litellm.ai/docs/tutorials/tag_management)
+
+<br />
 
 <Image
 img={require('../../img/release_notes/tag_management.png')}
 style={{width: '100%', display: 'block'}}
 />
 <p style={{textAlign: 'left', color: '#666'}}>
-Tag / Policy Management
+Create and manage tags.
 </p>
 3. Redesigned Login Screen:
 1. Polished login screen [PR](https://github.com/BerriAI/litellm/pull/9778)
 2. Microsoft SSO Auto-Sync:
 1. Added debug route to allow admins to debug SSO JWT fields [PR](https://github.com/BerriAI/litellm/pull/9835)
 2. Added ability to use MSFT Graph API to assign users to teams [PR](https://github.com/BerriAI/litellm/pull/9865)
-3. Connected LiteLLM to Azure Entra ID Enterprise Application [PR](https://github.com/BerriAI/litellm/pull/9872)
+3. Connected litellm to Azure Entra ID Enterprise Application [PR](https://github.com/BerriAI/litellm/pull/9872)
 4. Added ability for admins to set `default_team_params` for when litellm SSO creates default teams [PR](https://github.com/BerriAI/litellm/pull/9895)
 5. Fixed MSFT SSO to use correct field for user email [PR](https://github.com/BerriAI/litellm/pull/9886)
-6. Added UI support for setting Default Team setting when LiteLLM SSO auto creates teams [PR](https://github.com/BerriAI/litellm/pull/9918)
+6. Added UI support for setting Default Team setting when litellm SSO auto creates teams [PR](https://github.com/BerriAI/litellm/pull/9918)
 5. UI Bug Fixes:
 1. Prevented team, key, org, model numerical values changing on scrolling [PR](https://github.com/BerriAI/litellm/pull/9776)
 2. Instantly reflect key and team updates in UI [PR](https://github.com/BerriAI/litellm/pull/9825)
 
 ## Logging / Guardrail Improvements
 
 1. Prometheus:
-- Emit Key and Team Budget metrics on a cron job schedule [PR](https://github.com/BerriAI/litellm/pull/9528)
+- Emit Key and Team Budget metrics on a cron job schedule [Get Started](https://docs.litellm.ai/docs/proxy/prometheus#initialize-budget-metrics-on-startup)
 
 ## Security Fixes
 
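
To make the "Tag / Policy Management" item above concrete, here is a rough `config.yaml` sketch of tag-based routing, where requests tagged `private` only reach a designated deployment. The key names (`tags` under `litellm_params`, `enable_tag_filtering` under `router_settings`) are assumptions based on LiteLLM's tag-based routing docs; the linked tag_management tutorial is the authoritative reference.

```yaml
model_list:
  - model_name: gpt-4o-private
    litellm_params:
      model: azure/gpt-4o
      api_key: os.environ/AZURE_API_KEY
      api_base: os.environ/AZURE_API_BASE
      # assumption: requests sent with tags=["private"] are routed here
      tags: ["private"]
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
      # assumption: deployment for general / untagged traffic
      tags: ["default"]

router_settings:
  # assumption: enables filtering deployments by request tags
  enable_tag_filtering: true
```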