-
Example Code

from langchain_community.chat_models import ChatOpenAI
llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
from langchain.agents import tool
@tool
def get_word_length(word: str) -> int:
"""Returns the length of a word."""
return len(word)
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are very powerful assistant, but don't know current events",
        ),
        ("user", "{input}"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)
llm_with_tools = llm.bind_tools(tools)

Description

I followed the guide at https://python.langchain.com/docs/modules/agents/how_to/custom_agent to build a custom agent, but I get this error:
System Info

langchain==0.1.9
-
🤖 Hey @adream307, great to see you diving into the depths of LangChain again! 🌊

Based on the context provided, it seems like you're trying to use the bind_tools method with the ChatOpenAI class. However, as per the LangChain codebase and the issues found in the repository, there is no bind_tools method available in the ChatOpenAI class.

The bind_tools method is available in the ChatMistralAI class, which is a subclass of BaseChatModel. This method is designed to bind tool-like objects to the chat model, assuming the model is compatible with the OpenAI tool-calling API. You can find more details about this in the ChatMistralAI class definition in the LangChain codebase.

If you're trying to bind functions to the ChatOpenAI class, you can use the bind_functions method instead. Here's an example of how you can use it:

# Likely imports for this snippet; adjust the paths to your installed versions.
from langchain.agents import AgentExecutor
from langchain.agents.format_scratchpad import format_to_openai_function_messages
from langchain.agents.output_parsers import OpenAIFunctionsAgentOutputParser
from langchain_community.chat_models import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.utils.function_calling import format_tool_to_openai_function

# get_band_packages is a @tool-decorated function and system_msg a system prompt
# string, both defined elsewhere in this example.
llm = ChatOpenAI(openai_api_key="xxx", temperature=0.5)
tools = [get_band_packages]
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", system_msg),
        MessagesPlaceholder(variable_name="chat_history"),
        ("user", "{input}"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)
chat_history = []
functions = [format_tool_to_openai_function(t) for t in tools]
llm_with_tools = llm.bind_functions(functions=functions)
agent = (
    {
        "input": lambda x: x["input"],
        "agent_scratchpad": lambda x: format_to_openai_function_messages(
            x["intermediate_steps"]
        ),
        "chat_history": lambda x: x["chat_history"],
    }
    | prompt
    | llm_with_tools
    | OpenAIFunctionsAgentOutputParser()
)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
agent_executor.invoke({"input": "xxxxxxx", "chat_history": []})

In this example, the bind_functions method is used to attach the OpenAI function schemas to the model in place of bind_tools. Please try this approach and let me know if it resolves your issue. If you have any further questions or need additional clarification, feel free to ask.
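To make the conversion step in the example above more concrete, here is a small sketch of what format_tool_to_openai_function returns for the get_word_length tool from the original question. The import path and the exact schema fields are assumptions and can differ between langchain versions; treat the commented output as an illustration, not the canonical result.

from langchain.agents import tool
from langchain_core.utils.function_calling import format_tool_to_openai_function

@tool
def get_word_length(word: str) -> int:
    """Returns the length of a word."""
    return len(word)

# Prints an OpenAI function-calling schema, roughly:
# {"name": "get_word_length",
#  "description": "Returns the length of a word.",  # some versions prepend the signature
#  "parameters": {"type": "object",
#                 "properties": {"word": {"type": "string"}},
#                 "required": ["word"]}}
print(format_tool_to_openai_function(get_word_length))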
-
I encountered the same issue as you and resolved it in the following way: first, I uninstalled the packages:
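In case it helps others landing here: on langchain 0.1.x the usual fix (possibly what the package swap above amounts to) is to install the langchain-openai partner package and import ChatOpenAI from langchain_openai, which does implement bind_tools, instead of the legacy langchain_community.chat_models.ChatOpenAI used in the question. A minimal sketch, assuming a langchain-openai version recent enough to expose bind_tools:

# pip install -U langchain langchain-openai
from langchain.agents import tool
from langchain_openai import ChatOpenAI

@tool
def get_word_length(word: str) -> int:
    """Returns the length of a word."""
    return len(word)

tools = [get_word_length]

# langchain_openai.ChatOpenAI supports bind_tools, so the tutorial code runs as written.
llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
llm_with_tools = llm.bind_tools(tools)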
-
Similar problem for llamacpp, koboldcppai, and ollama.
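For backends that lack an OpenAI-style tool-calling API, one workaround (not from this thread, so treat it as a hedged suggestion) is a ReAct agent, which drives tool use purely through prompting. A rough sketch, assuming langchain 0.1.x with langchainhub installed, a running Ollama server, and a locally pulled model named llama2 (both the model name and the question string are placeholders):

# pip install -U langchain langchain-community langchainhub
from langchain import hub
from langchain.agents import AgentExecutor, create_react_agent, tool
from langchain_community.chat_models import ChatOllama

@tool
def get_word_length(word: str) -> int:
    """Returns the length of a word."""
    return len(word)

tools = [get_word_length]

# The ReAct prompt encodes tool descriptions and the Thought/Action/Observation
# loop in plain text, so no native tool-calling support is required.
prompt = hub.pull("hwchase17/react")
llm = ChatOllama(model="llama2", temperature=0)

agent = create_react_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True, handle_parsing_errors=True)
agent_executor.invoke({"input": "How many letters are in the word 'education'?"})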