streaming related code #28937
Closed
navaneethstabilix announced in Ask Dosu (Archived)
Replies: 1 comment
-
To add streaming to your code using `MemorySaver` and `create_react_agent`, call the agent executor's `stream` method, which yields output chunks as each step of the graph completes:

```python
from langgraph.checkpoint.memory import MemorySaver
from langgraph.prebuilt import create_react_agent
from langchain_anthropic import ChatAnthropic
from langchain_community.tools.tavily_search import TavilySearchResults
from langchain_core.messages import HumanMessage

# Initialize the checkpointer that persists conversation state between turns
memory = MemorySaver()

# Create the language model and tools
model = ChatAnthropic(model_name="claude-3-sonnet-20240229")
search = TavilySearchResults(max_results=2)
tools = [search]

# Create the agent with memory
agent_executor = create_react_agent(model, tools, checkpointer=memory)

# Configuration with a thread_id so both calls share the same memory
config = {"configurable": {"thread_id": "abc123"}}

# Stream the agent's steps for the first message
for chunk in agent_executor.stream(
    {"messages": [HumanMessage(content="hi im bob! and i live in sf")]}, config
):
    print(chunk)
    print("----")

# A second call on the same thread_id remembers the earlier turn
for chunk in agent_executor.stream(
    {"messages": [HumanMessage(content="whats the weather where I live?")]}, config
):
    print(chunk)
    print("----")
```

This code sets up an agent with memory using `MemorySaver` and streams its intermediate steps chunk by chunk; because both calls share the same `thread_id`, the second question can draw on context from the first.
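Compiled LangGraph graphs also expose an async counterpart, `astream`, which is consumed with `async for` inside an event loop. Below is a minimal runnable sketch of that pattern; `fake_astream` is a hypothetical stand-in for `agent_executor.astream` (the real call needs model API keys), but the consuming loop is the same shape you would use with the real method:

```python
import asyncio

# HYPOTHETICAL stand-in for agent_executor.astream: an async generator
# that yields one chunk per graph step, mimicking the {node: update}
# dicts that LangGraph streaming produces.
async def fake_astream(inputs, config):
    for step in ("agent", "tools", "agent"):
        await asyncio.sleep(0)  # yield control, as real network I/O would
        yield {step: {"messages": [f"chunk from {step}"]}}

async def main():
    config = {"configurable": {"thread_id": "abc123"}}
    chunks = []
    # This `async for` loop is exactly how you would consume astream
    async for chunk in fake_astream({"messages": ["hi im bob!"]}, config):
        chunks.append(chunk)
        print(chunk)
        print("----")
    return chunks

chunks = asyncio.run(main())
```

With the real agent, you would simply replace `fake_astream` with `agent_executor.astream` and keep the loop unchanged.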
-
```python
from langgraph.checkpoint.memory import MemorySaver

memory = MemorySaver()
react_graph_memory = builder.compile(checkpointer=memory)
```

How do I add `astream` / `stream` to this code?