Replies: 1 comment
-
Hello, @SenvenJ! I'm here to help you with any bugs, questions, or contributions you have. To display all of the agent's reasoning, including the "Thought" part, on the web interface, use the `StreamlitCallbackHandler`:

```python
import streamlit as st
from langchain import hub
from langchain.agents import AgentExecutor, create_react_agent, load_tools
from langchain_openai import OpenAI
from langchain_community.callbacks.streamlit import StreamlitCallbackHandler

# Streaming LLM so thoughts render incrementally in the UI
llm = OpenAI(temperature=0, streaming=True)
tools = load_tools(["ddg-search"])
prompt = hub.pull("hwchase17/react")
agent = create_react_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

# Renamed from `prompt` to avoid shadowing the ReAct prompt above
if user_input := st.chat_input():
    st.chat_message("user").write(user_input)
    with st.chat_message("assistant"):
        st_callback = StreamlitCallbackHandler(
            st.container(),
            max_thought_containers=4,
            expand_new_thoughts=True,
            collapse_completed_thoughts=True,
        )
        response = agent_executor.invoke(
            {"input": user_input}, {"callbacks": [st_callback]}
        )
        st.write(response["output"])
```

This code sets up a Streamlit app that uses the `StreamlitCallbackHandler` to render the agent's intermediate thoughts and tool calls directly in the chat interface.
-
Checked other resources
Commit to Help
Example Code
Description
System Info
langchain==0.1.20
langchain-community==0.0.38
langchain-core==0.1.52
langchain-experimental==0.0.49
langchain-text-splitters==0.0.2
langchainhub==0.1.21