-
Example Code

```python
llm = ChatLlamaCpp(
    model_path="./Hermes-2-Pro-Llama-3-8B-Q8_0.gguf",
    verbose=False,
    repeat_penalty=1.5,
    top_p=0.5,
    max_tokens=512,
)

@tool("get_magic_function", args_schema=MagicFunctionInput)
def magic_function(magic_function_input: int):
    """Get the value of magic function for an input."""
    return magic_function_input + 2

llm_with_tools = llm.bind_tools(
    tools=[get_weather, magic_function],
    tool_choice={"type": "function", "function": {"name": "get_current_weather"}},
)

ai_msg = llm_with_tools.invoke("What is the current weather in SF?")
```

Description

It seems that ChatLlamaCpp cannot be bound with a list of tools. I tried to set tool_choice to "any", but it gives me an error message. Is this because ChatLlamaCpp does not support choosing between tools yet? Can anyone help? Thanks!

System Info

File "/Users/yichaod/.pyenv/versions/3.10.14/lib/python3.10/site-packages/langchain_community/chat_models/llamacpp.py", line 364, in bind_tools
Replies: 2 comments
-
Hello, @evan-dayy! I'm here to help you with any bugs, questions, or contribution guidance you may need. Let's tackle this issue together!

Yes, the ChatLlamaCpp model currently does not support choosing between tools automatically. The tool_choice parameter does not accept "any" or "auto" like the OpenAI tool-calling API; you need to specify a single tool to force its use. Here's the relevant part of the bind_tools method:

```python
def bind_tools(
    self,
    tools: Sequence[Union[Dict[str, Any], Type[BaseModel], Callable, BaseTool]],
    *,
    tool_choice: Optional[Union[Dict[str, Dict], bool, str]] = None,
    **kwargs: Any,
) -> Runnable[LanguageModelInput, BaseMessage]:
    """Bind tool-like objects to this chat model

    tool_choice: does not currently support "any", "auto" choices like OpenAI
        tool-calling API. should be a dict of the form to force this tool
        {"type": "function", "function": {"name": <<tool_name>>}}.
    """
    ...
```

This means you need to specify the tool explicitly in the tool_choice parameter. For example, if you want to use the get_magic_function tool:

```python
llm_with_tools = llm.bind_tools(
    tools=[get_weather, magic_function],
    tool_choice={"type": "function", "function": {"name": "get_magic_function"}},
)
```

This will force the model to use the get_magic_function tool.
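Once a tool is forced this way, the model's reply carries the call in its tool_calls attribute, and you still have to execute the tool yourself. The sketch below shows the dispatch pattern without requiring the model: the tool_calls payload here is hand-built to mimic the list-of-dicts shape (with "name" and "args" keys) that langchain-core's AIMessage.tool_calls uses, and the TOOLS registry and run_tool_calls helper are hypothetical names, not part of LangChain.

```python
# A minimal sketch of dispatching tool calls by hand, assuming the
# tool_calls payload has the {"name": ..., "args": ...} shape used by
# AIMessage.tool_calls. TOOLS and run_tool_calls are illustrative names.

def magic_function(magic_function_input: int) -> int:
    """Get the value of magic function for an input."""
    return magic_function_input + 2

# Registry mapping the tool name the model emits to the callable.
TOOLS = {"get_magic_function": magic_function}

def run_tool_calls(tool_calls):
    """Look up each requested tool by name and invoke it with its args."""
    results = []
    for call in tool_calls:
        fn = TOOLS[call["name"]]
        results.append(fn(**call["args"]))
    return results

# Payload shaped like what a forced tool_choice would produce:
calls = [{"name": "get_magic_function", "args": {"magic_function_input": 3}}]
print(run_tool_calls(calls))  # [5]
```

In a real chain you would read ai_msg.tool_calls instead of building the list by hand.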
-
Is there any workaround to bind multiple tools to the llm and have it auto-detect which tool to use, or follow the workflow?

```python
llm = ChatLlamaCpp(.........)
llm_with_tools = llm.bind_tools(
```
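Since ChatLlamaCpp only accepts a forced single-tool tool_choice, one possible workaround is to do the tool selection yourself before binding: pick a tool name (here via a naive keyword check standing in for a real routing step, such as a separate LLM call) and then build the forced tool_choice dict from it. The forced_tool_choice and route helpers below are hypothetical, not LangChain APIs.

```python
# Sketch of a manual routing workaround. Assumption: tool selection
# happens outside the model, then the chosen tool is forced via the
# dict format ChatLlamaCpp.bind_tools requires.

def forced_tool_choice(tool_name: str) -> dict:
    """Build the tool_choice dict that forces a single named tool."""
    return {"type": "function", "function": {"name": tool_name}}

def route(query: str) -> str:
    """Naive keyword routing; a real version might use a cheap LLM call."""
    if "weather" in query.lower():
        return "get_current_weather"
    return "get_magic_function"

query = "What is the current weather in SF?"
choice = forced_tool_choice(route(query))
print(choice)
# {'type': 'function', 'function': {'name': 'get_current_weather'}}
```

The resulting choice dict can then be passed as tool_choice to bind_tools, re-binding per query with whichever tool the router picked.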