What happened?
Description
When using LiteLLM v1.61.16 with claude-3-7-sonnet and thinking mode enabled, the Vercel AI SDK's OpenAI Provider fails to process the response because the tool_calls field is missing, resulting in an "Expected 'id' to be a string" error.
Environment
- LiteLLM version: 1.61.16
- Claude model: claude-3-7-sonnet
- Integration: Self-hosted LiteLLM with Vercel AI SDK OpenAI Provider
Steps to Reproduce
- Set up LiteLLM with claude-3-7-sonnet
- Enable thinking mode in litellm_params (see the config sketch after this list):
  {
    "thinking": {
      "type": "enabled",
      "budget_tokens": 32000
    }
  }
- Make a request through Vercel AI SDK's OpenAI Provider to the LiteLLM endpoint
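For context, a minimal proxy config sketch showing where the thinking block from step 2 sits (the model alias, upstream model ID, and API key reference are assumptions about the setup, not confirmed values):

```yaml
model_list:
  - model_name: claude-3-7-sonnet          # alias the client requests
    litellm_params:
      model: anthropic/claude-3-7-sonnet-20250219
      api_key: os.environ/ANTHROPIC_API_KEY
      thinking:                            # the block from step 2
        type: "enabled"
        budget_tokens: 32000
```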
Current Behavior
The request fails in the browser console with the error: "Expected 'id' to be a string"
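A minimal repro sketch for step 3, assuming a self-hosted proxy at http://localhost:4000 (the endpoint, key, and prompt are placeholders):

```ts
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

// Placeholder endpoint and key for the self-hosted LiteLLM proxy.
const litellm = createOpenAI({
  baseURL: 'http://localhost:4000/v1',
  apiKey: process.env.LITELLM_API_KEY ?? 'sk-placeholder',
});

const result = streamText({
  // 'claude-3-7-sonnet' is the model alias configured in LiteLLM.
  model: litellm('claude-3-7-sonnet'),
  prompt: 'Explain the error briefly.',
});

// With thinking mode enabled on the proxy, consuming the stream
// throws "Expected 'id' to be a string" inside the SDK's parser.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```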
Expected Behavior
The request should process successfully, handling the thinking mode response format appropriately.
Additional Context
- Response format without thinking mode:
  ChatCompletionMessage(
      content="...",
      refusal=None,
      role='assistant',
      audio=None,
      function_call=None,
      tool_calls=None
  )
- Response format with thinking mode enabled:
  ChatCompletionMessage(
      content="...",
      reasoning_content="..."
  )
  Note that with thinking mode enabled, fields such as role, refusal, function_call, and tool_calls are dropped from the message entirely rather than returned as None.
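To confirm which fields the proxy actually returns, here is a quick inspection sketch using the openai Node client pointed at the LiteLLM endpoint (the base URL and key are placeholders for a self-hosted setup):

```ts
import OpenAI from 'openai';

// Placeholder endpoint and key for a self-hosted LiteLLM proxy.
const client = new OpenAI({
  baseURL: 'http://localhost:4000/v1',
  apiKey: process.env.LITELLM_API_KEY ?? 'sk-placeholder',
});

const completion = await client.chat.completions.create({
  model: 'claude-3-7-sonnet',
  messages: [{ role: 'user', content: 'Say hello.' }],
});

// With thinking mode enabled, fields like tool_calls are expected
// to be missing from this object instead of present as null.
console.log(JSON.stringify(completion.choices[0].message, null, 2));
```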
Possible Solution
The LiteLLM implementation may need to ensure that responses stay compatible with OpenAI's response format when thinking mode is enabled, in particular preserving the tool_calls field structure that the Vercel AI SDK expects.
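Until the proxy output is normalized server-side, one possible client-side mitigation for non-streaming calls is sketched below. normalizeMessage is a hypothetical helper, not part of LiteLLM or the AI SDK, and it would not help the streaming path where the SDK's parser throws:

```ts
// Hypothetical shape of a message as returned by the proxy, where
// thinking mode may drop fields or add reasoning_content.
type LooseMessage = {
  role?: string;
  content?: string | null;
  reasoning_content?: string;
  refusal?: string | null;
  function_call?: unknown | null;
  tool_calls?: Array<{ id?: unknown }> | null;
};

// Backfill the OpenAI-format fields downstream code expects, and
// drop tool_calls unless every entry carries a string id.
function normalizeMessage(msg: LooseMessage) {
  const toolCallsValid =
    Array.isArray(msg.tool_calls) &&
    msg.tool_calls.every((t) => typeof t.id === 'string');
  return {
    role: msg.role ?? 'assistant',
    content: msg.content ?? null,
    refusal: msg.refusal ?? null,
    function_call: msg.function_call ?? null,
    tool_calls: toolCallsValid ? msg.tool_calls : null,
  };
}
```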
Relevant log output
(node:internal/webstreams/transformstream:573:10)
    at node:internal/webstreams/transformstream:378:16
    at writableStreamDefaultControllerProcessWrite (node:internal/webstreams/writablestream:1129:5)
    at writableStreamDefaultControllerAdvanceQueueIfNeeded (node:internal/webstreams/writablestream:1244:5)
    at writableStreamDefaultControllerWrite (node:internal/webstreams/writablestream:1118:3)
    at writableStreamDefaultWriterWrite (node:internal/webstreams/writablestream:1008:3)
    at [kChunk] (node:internal/webstreams/readablestream:1586:31)
    at readableStreamFulfillReadRequest (node:internal/webstreams/readablestream:2119:24)
    at readableStreamDefaultControllerEnqueue (node:internal/webstreams/readablestream:2311:5)
    at transformStreamDefaultControllerEnqueue (node:internal/webstreams/transformstream:508:5)
    at TransformStreamDefaultController.enqueue (node:internal/webstreams/transformstream:324:5)
    at Object.transform (node:internal/webstreams/encoding:142:22)
Are you a ML Ops Team?
No
What LiteLLM version are you on?
v1.61.16
Twitter / LinkedIn details
No response