How do you format with the prompt template in the server UI? #3829
Replies: 3 comments
-
Probably not a direct answer, but I had success using the llama-cpp-python server module: `CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python[server]`, then `export MODEL="PATH_TO_YOUR_MODEL.gguf"`. You then get an OpenAI-like API on port 4891 that will correctly format your prompt as ChatML.
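If it helps, here is a minimal sketch of calling that server through the OpenAI Python client. The port is taken from the comment above, and the model name is a placeholder (local servers often ignore it); the server applies the ChatML template for you:

```python
from openai import OpenAI

# Point the client at the local llama-cpp-python server instead of api.openai.com.
client = OpenAI(
    base_url="http://localhost:4891/v1",  # port as stated in this comment
    api_key="not-needed",                 # the local server does not check the key
)

response = client.chat.completions.create(
    model="local-model",  # placeholder name; assumption, may be ignored locally
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)
print(response.choices[0].message.content)
```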
-
This is very much needed; I can't find any info either.
-
I would suggest opening an issue regarding this, especially because …
-
The Mistral finetunes keep repeating unless I use the ChatML format, but I don't understand how to format it properly in the server UI. I can't find much documentation about the template parameters in the UI, like `{{name}}: {{message}}`, `{{prompt}}`, `{{history}}`, or `{{char}}:`, either. Any insight into this?
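For reference, here is a minimal sketch of what a ChatML prompt looks like when built by hand. How the UI placeholders map onto these fields is my assumption, not documented behavior; I am guessing `{{prompt}}` is the system text, `{{history}}` the prior turns, and `{{char}}` the assistant role tag:

```python
# Build a ChatML-formatted prompt string by hand.
# Each turn is wrapped in <|im_start|>{role} ... <|im_end|> markers,
# and the prompt ends with an open assistant turn for the model to complete.
def to_chatml(system: str, history: list[tuple[str, str]], user_msg: str) -> str:
    parts = [f"<|im_start|>system\n{system}<|im_end|>"]
    for role, message in history:  # history as (role, message) pairs
        parts.append(f"<|im_start|>{role}\n{message}<|im_end|>")
    parts.append(f"<|im_start|>user\n{user_msg}<|im_end|>")
    parts.append("<|im_start|>assistant\n")  # left open for the model's reply
    return "\n".join(parts)

print(to_chatml(
    "You are a helpful assistant.",
    [("user", "Hi"), ("assistant", "Hello!")],
    "Why do Mistral finetunes repeat without ChatML?",
))
```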