Function call #15

Open
alelordelo opened this issue Oct 17, 2024 · 2 comments

Comments

@alelordelo

Is it possible to run function calls, like with the OpenAI Assistants API?

@alelordelo (Author)

I am new to Llama and MLX (I currently use the OpenAI Assistants API), so I did some research and found that Llama 3.2 can do function/tool calls:
https://www.llama.com/docs/model-cards-and-prompt-formats/llama3_2/

I did a first test with the prompt mentioned in the example above, and it seems to work. It also returns this info:

Should you decide to return the function call(s), Put it in the format of [func1(params_name=params_value, params_name2=params_value2...), func2(params)]
NO other text MUST be included.<|eot_id|><|start_header_id|>assistant<|end_header_id|>
[Screenshot of the model's response, 2024-10-17]

Now I am trying to figure out how to:

  • distinguish a tool call from a regular message
  • parse the function arguments and save them to Core Data

If anyone has insights on how to do this, I'd really appreciate it! (See the rough parsing sketch below.)

Thanks!
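
A minimal sketch of one way to handle the first bullet, assuming the bracketed `[func1(param=value, ...), func2(...)]` format quoted above. This is not part of MLX or mlx-swift-examples; `ToolCall`, `parseToolCalls`, and `get_weather` are made-up names, and the simple regex does not handle nested parentheses or commas inside quoted values.

```swift
import Foundation

// Hypothetical container for one parsed call; not a library type.
struct ToolCall {
    let name: String
    let arguments: [String: String]
}

/// Try to interpret a model response in the `[func1(param=value, ...)]`
/// format. Returns nil when the text does not look like a tool call,
/// so the caller can treat it as a regular assistant message instead.
func parseToolCalls(from response: String) -> [ToolCall]? {
    let text = response
        .replacingOccurrences(of: "<|eot_id|>", with: "")
        .trimmingCharacters(in: .whitespacesAndNewlines)

    // Regular messages will not be wrapped in a bracketed call list.
    guard text.hasPrefix("["), text.hasSuffix("]") else { return nil }

    // Capture each `name(arg1=value1, arg2=value2)` group.
    guard let regex = try? NSRegularExpression(pattern: #"(\w+)\(([^)]*)\)"#) else {
        return nil
    }
    let fullRange = NSRange(text.startIndex..., in: text)

    var calls: [ToolCall] = []
    for match in regex.matches(in: text, range: fullRange) {
        guard let nameRange = Range(match.range(at: 1), in: text),
              let argsRange = Range(match.range(at: 2), in: text) else { continue }

        var arguments: [String: String] = [:]
        for pair in text[argsRange].split(separator: ",") {
            let parts = pair.split(separator: "=", maxSplits: 1).map {
                $0.trimmingCharacters(in: .whitespaces)
                    .trimmingCharacters(in: CharacterSet(charactersIn: "\"'"))
            }
            if parts.count == 2 { arguments[parts[0]] = parts[1] }
        }
        calls.append(ToolCall(name: String(text[nameRange]), arguments: arguments))
    }
    return calls.isEmpty ? nil : calls
}

// A tool call vs. a regular message (function name is made up):
// parseToolCalls(from: #"[get_weather(city="Paris", unit="celsius")]"#)
//   -> one ToolCall named "get_weather" with ["city": "Paris", "unit": "celsius"]
// parseToolCalls(from: "Hello! How can I help?")
//   -> nil
```

Once parsed, the `arguments` dictionary can be written attribute by attribute into a Core Data entity the same way any other user-generated values would be; nothing about the tool-call format changes that part.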

@alelordelo (Author)

Update: the Jinja template now supports function calls:

ml-explore/mlx-swift-examples#174
johnmai-dev/Jinja#8
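
For the other direction (describing tools to the model), here is a hedged sketch of building the JSON tool-definition shape shown in the Llama 3.2 prompt-format docs linked above from Swift types. It does not use the mlx-swift-examples or Jinja APIs from the linked PRs; `ToolDefinition`, `ToolParameter`, and `get_weather` are illustrative names only.

```swift
import Foundation

// Hypothetical Swift-side description of a tool, mirroring the JSON
// shape from the Llama 3.2 prompt-format documentation.
struct ToolParameter: Codable {
    let type: String
    let description: String
}

struct ToolDefinition: Codable {
    struct Parameters: Codable {
        let type: String
        let required: [String]
        let properties: [String: ToolParameter]
    }
    let name: String
    let description: String
    let parameters: Parameters
}

// Illustrative tool; `get_weather` is not part of any library here.
let weatherTool = ToolDefinition(
    name: "get_weather",
    description: "Get the current weather for a city",
    parameters: .init(
        type: "dict",
        required: ["city"],
        properties: ["city": ToolParameter(type: "string",
                                           description: "Name of the city")]
    )
)

// Serialize to JSON and splice it into the system prompt (or hand it
// to a chat template that accepts tool definitions).
let encoder = JSONEncoder()
encoder.outputFormatting = [.prettyPrinted, .sortedKeys]
if let data = try? encoder.encode(weatherTool),
   let json = String(data: data, encoding: .utf8) {
    let systemPrompt = """
        You have access to the following function. If you decide to call it, \
        reply only in the [func_name(param=value, ...)] format with no other text.

        \(json)
        """
    print(systemPrompt)
}
```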
