Replies: 4 comments 6 replies
-
Hello again @alexgg278! It's a pleasure to see you here again. I hope you're doing well. Based on the information you've provided and the context I've found, it seems like you're experiencing an issue where the expected streaming events are not being emitted. However, I found a similar issue in the LangChain repository: Issue: How can I get back a streaming response with AgentExecutors?. This issue was closed, but it might contain some useful information or a solution to your problem. Also, there's an unsolved discussion: AgentExecutor is not streaming the output. This discussion might be related to your issue, and keeping an eye on it could provide some insights or solutions. Regarding the `astream_events` method, its implementation simply delegates to the bound runnable:

```python
async def astream_events(
    self,
    input: Input,
    config: Optional[RunnableConfig] = None,
    **kwargs: Optional[Any],
) -> AsyncIterator[StreamEvent]:
    async for item in self.bound.astream_events(
        input, self._merge_configs(config), **{**self.kwargs, **kwargs}
    ):
        yield item
```

You can find this in libs/core/langchain_core/runnables/base.py. The events returned by the bound runnable are yielded through unchanged. I hope this information helps. If you have more details or if there's anything else you'd like to know, please let me know. Sources
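To see which events actually arrive from an iterator like the one above, a first debugging step is to collect the `"event"` field of everything it yields. Below is a minimal, self-contained sketch of that pattern; `fake_events` is a hypothetical stand-in for a real `AgentExecutor.astream_events(...)` call (real events are richer dicts, but they do carry an `"event"` key):

```python
import asyncio

async def fake_events():
    # Hypothetical stand-in for AgentExecutor.astream_events(...);
    # the dict shape loosely mirrors LangChain's StreamEvent.
    for name in ["on_chain_start", "on_chat_model_start",
                 "on_chat_model_stream", "on_chat_model_end", "on_chain_end"]:
        yield {"event": name, "data": {}}

async def collect_event_names(aiter):
    # Gather the "event" field of every yielded event,
    # to see which event types are actually emitted.
    return [e["event"] async for e in aiter]

names = asyncio.run(collect_event_names(fake_events()))
print(names)
```

If `on_chat_model_stream` never shows up in such a dump between the start and end events, the problem is upstream of any filtering logic.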
-
Closed
-
Same here! I have a setup running locally with
-
Hello everyone, I don't know if this will help you solve the issue or not, but give it a try. I was using Python 3.11 locally. Here you can see that when using Python 3.10, we don't get the `on_chat_model_stream` events (screenshot omitted), but when switching to Python 3.11, we get them (screenshot omitted). So the solution for me was to just change the Python version to 3.11. @TomTom101 @straygar @mdenison
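Since the interpreter version turned out to matter in this thread, one defensive option is a small runtime guard. This is a sketch under that assumption (the function name `check_streaming_support` is hypothetical, not a LangChain API):

```python
import sys

def check_streaming_support():
    # Per the report above, on_chat_model_stream events were missing
    # on Python 3.10 but present on 3.11, so warn on older interpreters.
    if sys.version_info < (3, 11):
        return "warning: Python >= 3.11 recommended for astream_events"
    return "ok"

print(check_streaming_support())
```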
-
Checked other resources
Commit to Help
Example Code
Description
I am trying to use astream_events on a LangChain AgentExecutor with the goal of only streaming the final output generation of the agent. I have read in the documentation that to get streaming, the events should be filtered, looking only for "on_chat_model_stream" events. However, when I print all the events I don't see any "on_chat_model_stream" events, not even in between the "on_chat_model_start" and "on_chat_model_end" events.
Take a look at the screenshot:
In the docs there is a clear example of how to stream the final answer using "on_chat_model_stream", but you can only do that if you get these events out of the iterator: https://python.langchain.com/docs/expression_language/streaming#filtering-events
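The filtering approach the docs describe can be sketched without LangChain installed; `fake_agent_events` below is a hypothetical stand-in for `AgentExecutor.astream_events(...)`, reduced to the fields the filter needs:

```python
import asyncio

async def fake_agent_events():
    # Hypothetical stand-in for AgentExecutor.astream_events(...).
    yield {"event": "on_chat_model_start", "data": {}}
    for token in ["Hello", " ", "world"]:
        # Token chunks arrive as on_chat_model_stream events.
        yield {"event": "on_chat_model_stream", "data": {"chunk": token}}
    yield {"event": "on_chat_model_end", "data": {}}

async def stream_final_answer(events):
    # Keep only the token chunks, as in the docs' filtering example,
    # and join them into the final answer.
    out = []
    async for e in events:
        if e["event"] == "on_chat_model_stream":
            out.append(e["data"]["chunk"])
    return "".join(out)

result = asyncio.run(stream_final_answer(fake_agent_events()))
print(result)
```

The filter only works if the iterator emits `on_chat_model_stream` events at all, which is exactly what is missing in the reported setup.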
System Info
langchain-core = "0.1.16"
openai = ">1"
langchain-openai = "^0.0.5"
langchain = "0.1.0"