[Bug]: When using streaming with the OpenAI API, the ID received with each chunk isn't from OpenAI; rather, it's generated by the LiteLLM client.
#10280
What happened?
request
response
Some explanations
The chunk response ID "chatcmpl-fcd248be-a3b9-45ee-bbd9-1ef2b74d3681" does not come from the OpenAI response. Looking at the code in /litellm/litellm_core_utils/streaming_handler, the method def chunk_creator(self, chunk: Any) calls self.model_response_creator() without passing the chunk information. As a result, the SDK automatically generates an ID and assigns it to self.response_id, which prevents subsequent processing from updating the ID to the one OpenAI returned.
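A minimal sketch of the behavior described above. The class and method names mirror those mentioned in the report (chunk_creator, model_response_creator, self.response_id), but the implementations here are hypothetical stand-ins, not LiteLLM's actual code: the point is only to show that when the provider chunk is not forwarded, a locally generated ID is minted and then pinned for the rest of the stream.

```python
import uuid


class ModelResponse:
    """Stand-in for the SDK response object: auto-generates an ID when none is given."""

    def __init__(self, id=None):
        self.id = id or f"chatcmpl-{uuid.uuid4()}"


class StreamingHandler:
    """Hypothetical handler mimicking the reported flow."""

    def __init__(self):
        self.response_id = None

    def model_response_creator(self, chunk=None):
        # If chunk is None (the reported bug), a fresh ID is generated here.
        model_response = ModelResponse(id=chunk.get("id") if chunk else None)
        # First chunk pins self.response_id; later chunks reuse it, so the
        # locally generated ID can never be replaced by the provider's ID.
        if self.response_id is None:
            self.response_id = model_response.id
        model_response.id = self.response_id
        return model_response


def chunk_creator_buggy(handler, chunk):
    # As reported: the chunk is not passed through.
    return handler.model_response_creator()


def chunk_creator_fixed(handler, chunk):
    # Possible fix: forward the provider chunk so its ID is preserved.
    return handler.model_response_creator(chunk=chunk)


provider_chunk = {"id": "chatcmpl-openai-original"}

buggy = chunk_creator_buggy(StreamingHandler(), provider_chunk)
fixed = chunk_creator_fixed(StreamingHandler(), provider_chunk)

print(buggy.id == provider_chunk["id"])  # False: ID was generated locally
print(fixed.id == provider_chunk["id"])  # True: OpenAI's ID is preserved
```

Running this shows the buggy path returning a locally minted "chatcmpl-&lt;uuid&gt;" ID while the fixed path echoes the provider's ID, matching the mismatch observed in the logs.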
Relevant log output
Are you an ML Ops Team?
No
What LiteLLM version are you on ?
v1.66.1
Twitter / LinkedIn details
No response