Get openai request ID from langchain #21074
Unanswered
sanjeevanahilan
asked this question in Q&A
Replies: 2 comments 1 reply
-
Upgrade to a newer version of langchain and use the OpenAI partner package (`langchain-openai`). 0.0.343 is very old now.
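A minimal upgrade sketch (package names as published on PyPI; pin exact versions to suit your project):

```shell
# Replace the legacy monolithic install with the current split packages.
pip install -U langchain langchain-openai
```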
-
I can confirm that when stream=True, the id is set by langchain (with a "run-" prefix), while with stream=False the id is set correctly from the OpenAI API. I am using the latest langchain and langchain-openai versions. In "langchain_openai/chat_models/base.py":

```python
def _convert_delta_to_message_chunk(
    _dict: Mapping[str, Any], default_class: type[BaseMessageChunk]
) -> BaseMessageChunk:
    id_ = _dict.get("id")
    role = cast(str, _dict.get("role"))
    content = cast(str, _dict.get("content") or "")
    additional_kwargs: dict = {}
```

In stream mode, langchain-openai tries to get the id from each chunk, but I think the "id" should be taken from the response object instead.
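To illustrate the point above, here is a minimal sketch of the two response shapes, using hypothetical payloads modeled on the OpenAI chat completions API: in both cases the request id lives on the top-level object, not inside the per-choice "delta" dict that `_convert_delta_to_message_chunk` receives.

```python
# Hypothetical payloads illustrating where the OpenAI request id lives.

# Non-streaming: one response object, id at the top level.
response = {
    "id": "chatcmpl-9IaVqJEDF1IxZgiVui6YZmqNMEktY",
    "choices": [{"message": {"role": "assistant", "content": "Hi"}}],
}

# Streaming: each chunk repeats the same top-level id, but the "delta"
# dict itself has no "id" key, so _dict.get("id") inside
# _convert_delta_to_message_chunk returns None and a fallback
# "run-..." id ends up on the message chunk.
chunk = {
    "id": "chatcmpl-9IaVqJEDF1IxZgiVui6YZmqNMEktY",
    "choices": [{"delta": {"role": "assistant", "content": "Hi"}}],
}

delta = chunk["choices"][0]["delta"]
print(delta.get("id"))  # None: the delta carries no id
print(chunk["id"])      # the request id is on the enclosing chunk
```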
-
Description
When calling ainvoke or agenerate, I don't receive the OpenAI request id in the output, e.g. chatcmpl-9IaVqJEDF1IxZgiVui6YZmqNMEktY. This would come from the full response object. I need to track the id of each request for my application; how can I do this? Is there an optional argument I can provide?
System Info
langchain==0.0.343