how to stream output messages only #28940
Asked by navaneethstabilix in Ask Dosu (Archived) · 1 comment
To stream only the output messages from the model, extract the message content inside your asynchronous generator. With `stream_mode="messages"`, each streamed chunk is a `(message_chunk, metadata)` tuple, and `message_chunk.content` holds the token text:

```python
async def generate_chat_responses(message):
    async for message_chunk, metadata in react_graph_memory.astream(
        {"messages": [("user", message)]},
        config=config,
        stream_mode="messages",
    ):
        # message_chunk is a streamed message chunk; yield only its text
        yield message_chunk.content

# To consume this asynchronous generator:
async def main():
    async for response in generate_chat_responses("what is the weather here"):
        print(response)

# Run in an async environment (e.g., Jupyter or a Python script)
import asyncio
await main()
```

Note that a bare `await main()` at the top level only works where an event loop is already running, such as in Jupyter; in a plain Python script, call `asyncio.run(main())` instead.
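To verify the extraction logic without running an actual graph, here is a minimal self-contained sketch. `FakeChunk` and `fake_astream` are hypothetical stand-ins for the LangGraph objects, not real APIs; they only mimic the `(message_chunk, metadata)` tuple shape described above:

```python
import asyncio
from dataclasses import dataclass

# Hypothetical stand-in for a streamed message chunk (real runs in
# stream_mode="messages" yield LangChain message chunks with .content).
@dataclass
class FakeChunk:
    content: str

async def fake_astream():
    # Simulates react_graph_memory.astream(..., stream_mode="messages")
    for token in ["It", " is", " sunny", "."]:
        yield FakeChunk(token), {"langgraph_node": "agent"}

async def generate_chat_responses():
    # Yield only the token text, discarding the metadata half of each tuple
    async for message_chunk, metadata in fake_astream():
        yield message_chunk.content

async def main():
    return "".join([part async for part in generate_chat_responses()])

print(asyncio.run(main()))  # -> It is sunny.
```

Joining the yielded parts reconstructs the full model response, which is how a typical chat UI would accumulate the stream.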
```python
config = {"configurable": {"thread_id": "384156"}}

async def generate_chat_responses(message):
    full_response = " "
    async for chunk in react_graph_memory.astream(
        {"messages": [("user", message)]},
        config=config,
        stream_mode="messages",
    ):
        print("haha", chunk)
        # yield f"{chunk}\n\n"
        yield chunk

# To consume this asynchronous generator:
async def main():
    async for response in generate_chat_responses("what is the weather here"):
        print(response)

# Run in an async environment (e.g., Jupyter or Python script)
import asyncio
await main()
```
How do I stream only the model's response messages in this case?
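One common approach for this case, where the raw chunks include output from every graph node, is to filter on the metadata half of each streamed tuple so that only the model node's tokens come through. The node name `"agent"` and the mock objects below are assumptions for illustration, not real LangGraph APIs:

```python
import asyncio
from dataclasses import dataclass

# Hypothetical stand-ins: a real graph yields message chunks whose
# metadata carries the name of the node that produced them.
@dataclass
class FakeChunk:
    content: str

async def fake_astream():
    # Mixed stream: tokens from a tool node and from the model node
    yield FakeChunk("72"), {"langgraph_node": "tools"}
    yield FakeChunk("It's"), {"langgraph_node": "agent"}
    yield FakeChunk(" 72F"), {"langgraph_node": "agent"}

async def generate_chat_responses():
    async for message_chunk, metadata in fake_astream():
        # Keep only tokens produced by the model ("agent") node
        if metadata.get("langgraph_node") == "agent":
            yield message_chunk.content

async def main():
    return "".join([part async for part in generate_chat_responses()])

print(asyncio.run(main()))  # -> It's 72F
```

The same filter works inside the generator from the question: instead of `yield chunk`, unpack the tuple and yield the content only when the metadata identifies the model node.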