How to Best Implement LLM Summaries of Streamed Values in LangChain? #28192
nick-youngblut started this conversation in General
I'm working on a project where I use LangChain to create a ReAct agent. As the agent streams, I want to generate a concise summary of each step in its workflow. Currently I'm using a separate LLM chain to handle the summaries, but I'm wondering if there's a more efficient or better way to implement this.
Here’s a simplified version of my code:
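(The original code snippet didn't survive extraction.) A minimal sketch of the pattern described above, with illustrative names: each streamed agent step is handed to a separate summarizer. The summarizer is injected as a plain callable so the LangChain-specific piece (e.g. a `prompt | llm` chain's `.invoke`) can be swapped in without the example depending on API keys:

```python
# Hypothetical sketch: summarize each streamed agent step with a
# separate LLM call. Names (summarize_stream, step_summary_chain)
# are illustrative, not from the original post.
from typing import Callable, Iterable, List


def summarize_stream(steps: Iterable[dict],
                     summarize: Callable[[str], str]) -> List[str]:
    """Run the summarizer once per streamed step and collect results."""
    summaries = []
    for step in steps:
        # A real LangChain agent yields chunks from agent.stream(...);
        # here each chunk is reduced to text before summarization.
        text = str(step.get("messages", step))
        summaries.append(summarize(text))
    return summaries


# With LangChain, `summarize` could wrap the separate summary chain:
#   summarize = lambda text: step_summary_chain.invoke({"step": text})
```

In this shape the per-step summary stays decoupled from the agent itself, which is essentially the approach the post describes; the open question is whether a callback-based hook would be cheaper than a second chain invocation per step.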
Example output:
One problem with this approach is that LangSmith tracks each invocation of `step_summary_chain`. I've tried to disable LangSmith tracking of `step_summary_chain` invocations, but I haven't found a solution.
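One possible workaround, under the assumption that LangChain consults the `LANGCHAIN_TRACING_V2` environment variable at the start of each run, is to toggle that variable off around the summary calls. This is a sketch, not a documented LangSmith API, and it is process-global (not thread-safe):

```python
# Assumption: LangChain decides per run whether to send a trace to
# LangSmith by reading LANGCHAIN_TRACING_V2, so clearing it around a
# call suppresses tracing for that call only. The env var is shared
# by the whole process, so concurrent runs would also be affected.
import os
from contextlib import contextmanager


@contextmanager
def langsmith_tracing_disabled():
    """Temporarily disable LangSmith tracing via its env var."""
    prev = os.environ.get("LANGCHAIN_TRACING_V2")
    os.environ["LANGCHAIN_TRACING_V2"] = "false"
    try:
        yield
    finally:
        # Restore whatever value was set before entering the block.
        if prev is None:
            os.environ.pop("LANGCHAIN_TRACING_V2", None)
        else:
            os.environ["LANGCHAIN_TRACING_V2"] = prev


# Usage (step_summary_chain is the separate summary chain):
#   with langsmith_tracing_disabled():
#       summary = step_summary_chain.invoke({"step": step_text})
```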