Replies: 1 comment
-
I am also trying to create an agent with the LLama2-7b-chat-hf model locally.
If I tweak the prompt even a little, the whole output changes. Sometimes the agent calls the wiki tool; sometimes it doesn't call any tool and generates the Final Answer directly; sometimes it loops infinitely. I don't know how to fix it either.
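A minimal sketch of one way to tame this kind of looping, assuming `toolkit` and `llm` are already defined as in the snippet below. `max_iterations`, `early_stopping_method`, and `handle_parsing_errors` are standard `initialize_agent`/`AgentExecutor` options; the specific values are assumptions, not a confirmed fix:

```python
from langchain.agents import initialize_agent

# Hedged sketch: capping iterations and tolerating malformed output can help
# with small local models that loop or skip the ReAct format.
agent = initialize_agent(
    toolkit,
    llm,
    agent="zero-shot-react-description",
    verbose=True,
    max_iterations=3,                  # stop runaway Thought/Action loops
    early_stopping_method="generate",  # ask the LLM for a final answer at the cap
    handle_parsing_errors=True,        # retry instead of raising on bad output
)
```

This does not make the model follow the ReAct format reliably, but it bounds the failure modes: the run terminates after a few steps and parsing errors are fed back to the model instead of crashing the chain.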
-
I am able to use the agent together with SerpAPI and OpenAI.
But I have a problem doing the same with a local MPT-7b-chat model via HuggingFacePipeline.
Please help.
CS
My code:
```python
from langchain.agents import load_tools, initialize_agent
from langchain.llms import HuggingFacePipeline

query = ("Get Microsoft share price from www.bloomberg.com and include "
         "the url where you got this information.")

# `generate_text` is a transformers text-generation pipeline defined earlier;
# SERPAPI_API_KEY holds my SerpAPI key.
llm = HuggingFacePipeline(pipeline=generate_text)
toolkit = load_tools(["serpapi"], llm=llm, serpapi_api_key=SERPAPI_API_KEY)
agent = initialize_agent(
    toolkit,
    llm,
    agent="zero-shot-react-description",
    verbose=True,
    return_intermediate_steps=True,
)
response = agent({"input": query})
```
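For context, a hypothetical sketch of how the `generate_text` pipeline above might be built. The model id and generation settings are assumptions (the post only says "MPT-7b-chat"), not taken from the original code:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
from langchain.llms import HuggingFacePipeline

model_name = "mosaicml/mpt-7b-chat"  # assumed model id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)

generate_text = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    max_new_tokens=256,
    return_full_text=False,  # the agent needs only the completion, not the prompt
)
llm = HuggingFacePipeline(pipeline=generate_text)
```

`return_full_text=False` matters for agents: if the prompt is echoed back, the ReAct output parser sees the whole scratchpad again and tends to misparse.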
Verbose message: