Replies: 1 comment
-
Copy and paste that suggested format into a text file, remove the "..." at the end, replace "Implement a linked list in C++" with your question, and use that as the prompt. That's all for single-run mode. If you want to use interactive mode, asking more than one question in a single "session", frankly, it's easier to use |
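If you're running the server rather than the CLI, the same idea applies: you build the formatted prompt yourself and send the already-formatted text in the `prompt` field. Here's a minimal sketch, assuming the server is listening on the default `http://localhost:8080` and using its `/completion` endpoint:

```python
import json
import urllib.request

# Phind-style prompt, built by hand to match what the model expects.
prompt = (
    "### System Prompt\n"
    "You are an intelligent programming assistant.\n\n"
    "### User Message\n"
    "Implement a linked list in C++\n\n"
    "### Assistant\n"
)

# llama.cpp server /completion endpoint; assumes the default port 8080.
req = urllib.request.Request(
    "http://localhost:8080/completion",
    data=json.dumps({"prompt": prompt, "n_predict": 512}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

# The generated text comes back in the "content" field.
print(result["content"])
```

As far as I know, `/completion` passes the prompt through as-is, so whatever formatting the model expects has to be baked into that string yourself.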
-
When running the llama.cpp server with a GGUF model, do I need to specify the instruction format?
If so, can someone explain to me how to do so?
For example, the Phind-CodeLlama-34B-v2 model card suggests an instruction format roughly like this:
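```
### System Prompt
You are an intelligent programming assistant.

### User Message
Implement a linked list in C++

### Assistant
...
```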
Where can I specify this?
NB: I'm not talking about specifying an output format or any grammar. I'm talking about formatting the prompt so that it fits what the model expects.