Unable to get usage while using streaming function #32226
Unanswered
talhaanwarch asked this question in Q&A
Replies: 1 comment
-
You’re not alone: usage metadata often gets lost during streaming if the callback doesn’t capture it after the final yield. One workaround I’ve used: instead of streaming directly to the terminal, buffer the chunks and trigger the callback after the generator completes. Something like:

```python
chunks = []
for chunk in response_stream:
    if hasattr(chunk, "content"):
        chunks.append(chunk.content)

# force the callback to flush manually once the stream is exhausted
callback.flush()  # or check what your handler exposes
```

Also double-check if your … Let me know if that helps; I ran into this exact hiccup while building a .txt-based LLM agent framework recently.
-
Here is my code snippet
This is the last chunk
Versions
langchain-openai: 0.3.28
langchain-core: 0.3.70
langchain: 0.3.26
langsmith: 0.4.8
openai: 1.97.0
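For reference, given openai 1.97.0 above, token usage can also be requested directly at the SDK level. A rough sketch (not the original snippet; the prompt and model name are placeholders) using stream_options={"include_usage": True}, which makes the API emit one final chunk whose usage field is populated:

```python
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

stream = client.chat.completions.create(
    model="gpt-4o-mini",                     # placeholder model name
    messages=[{"role": "user", "content": "Tell me a joke"}],
    stream=True,
    stream_options={"include_usage": True},  # ask for a final usage-only chunk
)

usage = None
for chunk in stream:
    # normal chunks carry deltas; the final usage chunk has an empty choices list
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
    if chunk.usage is not None:
        usage = chunk.usage

print()
print(usage)  # CompletionUsage(prompt_tokens=..., completion_tokens=..., total_tokens=...)
```

If usage arrives here but still disappears in the LangChain stream, that would suggest the gap is in how the chunks are consumed on the LangChain side rather than in the API itself.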