[Bug]: Vercel AI SDK OpenAI provider fails with claude-3-7-sonnet when thinking mode is enabled ("Expected 'id' to be a string") #8825

@jerry-intrii

What happened?

Description

When using LiteLLM v1.61.16 with claude-3-7-sonnet and enabling thinking mode, the Vercel AI SDK's OpenAI Provider fails to process the response because the message omits the tool_calls field, resulting in an "Expected 'id' to be a string" error.

Environment

  • LiteLLM version: 1.61.16
  • Claude model: claude-3-7-sonnet
  • Integration: Self-hosted LiteLLM with Vercel AI SDK OpenAI Provider

Steps to Reproduce

  1. Set up LiteLLM with claude-3-7-sonnet
  2. Enable thinking mode in litellm_params:
{
  "thinking": {
    "type": "enabled",
    "budget_tokens": 32000
  }
}
  3. Make a request through Vercel AI SDK's OpenAI Provider to the LiteLLM endpoint (see the client sketch below)
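
For reference, a minimal client sketch of step 3. Assumptions (not from the original report): a self-hosted LiteLLM proxy at http://localhost:4000/v1, an API key in LITELLM_API_KEY, and the model alias claude-3-7-sonnet configured on the proxy.

import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

// Point the Vercel AI SDK's OpenAI provider at the LiteLLM proxy.
// Base URL and key are assumptions for this sketch.
const litellm = createOpenAI({
  baseURL: 'http://localhost:4000/v1',
  apiKey: process.env.LITELLM_API_KEY,
});

// With thinking mode enabled on the proxy, this call is where
// "Expected 'id' to be a string" surfaces.
const result = await streamText({
  model: litellm('claude-3-7-sonnet'),
  prompt: 'Say hello.',
});

for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}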

Current Behavior

The request fails in the browser's console with the error: "Expected 'id' to be a string"

Expected Behavior

The request should process successfully, handling the thinking mode response format appropriately.

Additional Context

  1. Response format without thinking mode:
ChatCompletionMessage(
    content="...",
    refusal=None,
    role='assistant',
    audio=None,
    function_call=None,
    tool_calls=None
)
  2. Response format with thinking mode enabled:
ChatCompletionMessage(
    content="...",
    reasoning_content="..."
)
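
The difference is easy to confirm with a raw request against the proxy. A sketch, reusing the assumed proxy URL and key from above:

// Inspect the raw OpenAI-format payload returned by the LiteLLM proxy.
const res = await fetch('http://localhost:4000/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${process.env.LITELLM_API_KEY}`,
  },
  body: JSON.stringify({
    model: 'claude-3-7-sonnet',
    messages: [{ role: 'user', content: 'Say hello.' }],
  }),
});

const json = await res.json();
// With thinking enabled, the message carries reasoning_content and omits
// fields (refusal, tool_calls, ...) that the OpenAI format normally includes.
console.log(JSON.stringify(json.choices[0].message, null, 2));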

Possible Solution

The LiteLLM implementation may need to ensure that responses remain compatible with OpenAI's response format when thinking mode is enabled, particularly the tool_calls field structure expected by the Vercel AI SDK.
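
Until that is addressed, one possible client-side workaround: createOpenAI accepts a custom fetch, so non-streaming responses could be normalized before the SDK parses them. This is a sketch under the assumption that the SDK trips on a missing or malformed tool_calls entry; streaming (SSE) responses would need an equivalent TransformStream over the chunks instead.

// Wrap fetch to normalize LiteLLM responses before the SDK sees them.
const normalizingFetch: typeof fetch = async (input, init) => {
  const res = await fetch(input, init);
  // Pass streaming (SSE) responses through untouched.
  if (!res.headers.get('content-type')?.includes('application/json')) {
    return res;
  }
  const body = await res.json();
  for (const choice of body.choices ?? []) {
    const msg = choice.message;
    // Drop tool_calls entries whose id is not a string; remove the field
    // entirely if nothing valid remains. (Assumed failure mode.)
    if (msg && Array.isArray(msg.tool_calls)) {
      msg.tool_calls = msg.tool_calls.filter(
        (t: { id?: unknown }) => typeof t?.id === 'string',
      );
      if (msg.tool_calls.length === 0) delete msg.tool_calls;
    }
  }
  return new Response(JSON.stringify(body), {
    status: res.status,
    headers: res.headers,
  });
};

// Usage: createOpenAI({ baseURL, apiKey, fetch: normalizingFetch })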

Relevant log output

(node:internal/webstreams/transformstream:573:10)
    at node:internal/webstreams/transformstream:378:16
    at writableStreamDefaultControllerProcessWrite (node:internal/webstreams/writablestream:1129:5)
    at writableStreamDefaultControllerAdvanceQueueIfNeeded (node:internal/webstreams/writablestream:1244:5)
    at writableStreamDefaultControllerWrite (node:internal/webstreams/writablestream:1118:3)
    at writableStreamDefaultWriterWrite (node:internal/webstreams/writablestream:1008:3)
    at [kChunk] (node:internal/webstreams/readablestream:1586:31)
    at readableStreamFulfillReadRequest (node:internal/webstreams/readablestream:2119:24)
    at readableStreamDefaultControllerEnqueue (node:internal/webstreams/readablestream:2311:5)
    at transformStreamDefaultControllerEnqueue (node:internal/webstreams/transformstream:508:5)
    at TransformStreamDefaultController.enqueue (node:internal/webstreams/transformstream:324:5)
    at Object.transform (node:internal/webstreams/encoding:142:22)

Are you an ML Ops Team?

No

What LiteLLM version are you on?

v1.61.16

Twitter / LinkedIn details

No response

Metadata

Labels

bug (Something isn't working)
