Trying to use functions with vLLM #8695
joestein-ssc announced in Q&A
I followed the directions here (https://github.com/vllm-project/vllm/blob/main/docs/source/serving/openai_compatible_server.md#tool-calling-in-the-chat-completion-api-1), but now I am getting the error below. I am using the Llama3.1-70b-instruct model.
My code:

The error:
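Since the original code and error text did not load, here is a minimal sketch of the kind of tool-calling request the linked vLLM docs describe for the OpenAI-compatible `/v1/chat/completions` endpoint. The model name, tool definition, and city example below are illustrative assumptions, not the poster's actual code; the docs also note the server must be started with tool-calling support enabled (e.g. an auto tool choice flag and a tool-call parser appropriate for the model family).

```python
import json

# Sketch of a tool-calling request body for vLLM's OpenAI-compatible
# /v1/chat/completions endpoint. All names below (model, tool,
# parameters) are hypothetical placeholders for illustration.
payload = {
    "model": "meta-llama/Llama-3.1-70B-Instruct",  # assumed model identifier
    "messages": [
        {"role": "user", "content": "What is the weather in Boston?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool name
                "description": "Get the current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {
                            "type": "string",
                            "description": "City name",
                        }
                    },
                    "required": ["city"],
                },
            },
        }
    ],
    # Let the model decide whether to call the tool.
    "tool_choice": "auto",
}

# Serialize to JSON, as it would be POSTed to the server.
body = json.dumps(payload)
print(body[:60])
```

If the request succeeds, the model's reply should contain a `tool_calls` entry with the function name and JSON-encoded arguments instead of plain text; comparing the actual error against a known-good payload like this can help isolate whether the problem is in the request shape or the server configuration.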