How to disable the built-in chat template for llama-server #11119
Unanswered · RunningLeon asked this question in Q&A
Replies: 2 comments
-
I too really wish there were a way to disable the default chat template. I have to use very old versions of llama.cpp to avoid this behavior and run models not meant for completion mode in raw completion mode.
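As a workaround, the server's raw completion endpoint may bypass chat templating entirely: in llama.cpp's server, the template is applied on the chat endpoint (`/v1/chat/completions`), while `/completion` takes the prompt verbatim. Below is a minimal sketch of calling `/completion` directly; the URL, port, and the `build_completion_request` helper are illustrative assumptions, not part of this thread.

```python
# Sketch: send a raw prompt to llama-server's /completion endpoint,
# which (unlike /v1/chat/completions) does not apply a chat template.
# The endpoint path, port, and helper names are assumptions for illustration.
import json
import urllib.request


def build_completion_request(prompt: str, n_predict: int = 64) -> dict:
    # The prompt is sent as-is; no template wrapping happens client-side.
    return {"prompt": prompt, "n_predict": n_predict}


def post_completion(payload: dict,
                    url: str = "http://localhost:8080/completion") -> dict:
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    payload = build_completion_request("Once upon a time")
    print(json.dumps(payload))
    # post_completion(payload)  # requires a running llama-server instance
```

If you control how the server is launched, the `--chat-template` flag also lets you override which template the chat endpoint uses, though it does not turn templating off for that endpoint.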
-
It seems the built-in chat template is always applied to prompts. How can I disable it? Thanks.