How to add reasoning_content with CustomLLM? #9489
KyleZhang0536 asked this question in Q&A (unanswered)
Hi team,
We have an upstream LLM API with its own request format and its own streaming chunk format; the output chunks include a `reasoning_content` field.
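The original payloads aren't reproduced here; as a purely hypothetical sketch of the relevant shape, an upstream chunk might look like this (field names are illustrative, not the real API):

```python
# Hypothetical upstream streaming chunk -- field names are illustrative.
# The key point: reasoning_content arrives alongside the normal answer text.
upstream_chunk = {
    "content": "The answer is 42.",
    "reasoning_content": "First, restate the question, then ...",
    "finish_reason": None,  # e.g. "stop" on the final chunk
}
```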
So we want to use CustomLLM to translate it into an OpenAI-compatible format. According to this doc, my CustomLLM returns `Iterator[GenericStreamingChunk]` as its response, but `GenericStreamingChunk` doesn't have a `reasoning_content` field.
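A minimal sketch of the situation, following the custom provider docs (`call_upstream_api` is a hypothetical stand-in for our real client, not part of litellm):

```python
from typing import Iterator

from litellm import CustomLLM
from litellm.types.utils import GenericStreamingChunk


def call_upstream_api(messages):
    """Hypothetical stand-in for the real upstream client."""
    yield {
        "content": "The answer is 42.",
        "reasoning_content": "First, restate the question, then ...",
        "finish_reason": "stop",
    }


class MyCustomLLM(CustomLLM):
    def streaming(self, *args, **kwargs) -> Iterator[GenericStreamingChunk]:
        for upstream in call_upstream_api(kwargs.get("messages", [])):
            chunk: GenericStreamingChunk = {
                "finish_reason": upstream.get("finish_reason") or "",
                "index": 0,
                "is_finished": upstream.get("finish_reason") is not None,
                "text": upstream.get("content", ""),
                "tool_use": None,
                "usage": None,
            }
            # upstream["reasoning_content"] has nowhere to go here:
            # GenericStreamingChunk defines no such key.
            yield chunk
```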
The response is then processed in `litellm/litellm_core_utils/streaming_handler.py` (line 1015 at commit 122ee63).
How to add reasoning_content with CustomLLM?