Do I need to provide the "chat template" or "prompt format" to llamafile? #650
Unanswered · edouardklein asked this question in Q&A
Hi all!
I'm using llamafile, as it's the only project I could compile with AMD GPU support, and because @jart and the other contributors did wonderful work and this stuff Just Works.
Now, I'm having trouble with the prompt format.
I have read the documentation I could find, but it doesn't give specific information and examples.
I have also run some tests, comparing a prompt that includes the format elements against one that doesn't.
In this particular case, the version with the prompt-format elements yielded the perfect answer to my go-to question ("What would you say are the 4 most important concepts defined and used by Pierre Bourdieu?"), whereas the version without the prompt format yielded a close, but not perfect, answer (it forgot "symbolic violence").
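For context, here is a sketch of what "adding the prompt format elements" can look like for a Llama 3 Instruct model. This is an assumption on my part about which model family is in play; the special tokens below (`<|begin_of_text|>`, `<|start_header_id|>`, `<|eot_id|>`) are the ones Meta documents for Llama 3, and the `llama3_prompt` helper is purely illustrative, not part of llamafile:

```python
# Sketch (assumption): hand-rolling the Llama 3 Instruct prompt format.
# Whether llamafile applies this template for you automatically is exactly
# the open question here; this shows what the "with format" variant adds.

def llama3_prompt(system: str, user: str) -> str:
    """Wrap a system and a user message in the Llama 3 Instruct template."""
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = llama3_prompt(
    "You are a helpful assistant.",
    "What would you say are the 4 most important concepts "
    "defined and used by Pierre Bourdieu?",
)
print(prompt)
```

The "without format" variant would just be the bare question string; the difference in output quality I describe above came from swapping between these two shapes of prompt.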
So I'm basically lost: do I need to provide the chat template myself, or does llamafile apply it for me?
To make matters worse, when I use a web UI on top of llamafile, I get the <|eot_id|> string at the end of each response (#630), and I don't know if that is a related problem.
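In the meantime I work around the leaked token on the client side. This is only a band-aid, and the `clean_response` helper and `STOP_TOKENS` tuple are my own names, not anything from llamafile; it assumes the leaked strings are the Llama 3 end-of-turn tokens:

```python
# Sketch (assumption): strip a leaked end-of-turn token from a response
# before displaying it, as a client-side workaround for issue #630.

STOP_TOKENS = ("<|eot_id|>", "<|end_of_text|>")  # Llama 3 special tokens

def clean_response(text: str) -> str:
    """Remove a trailing stop token (and surrounding whitespace), if any."""
    text = text.strip()
    for tok in STOP_TOKENS:
        if text.endswith(tok):
            text = text[: -len(tok)].rstrip()
    return text

print(clean_response("Habitus, field, capital, symbolic violence.<|eot_id|>"))
# → Habitus, field, capital, symbolic violence.
```

Obviously the real fix would be for the stop token to be consumed server-side, which is why I wonder whether the two problems share a cause.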
If anybody knows what's what, or even has the slightest idea of what's going on, I'd love to hear it!
Thanks :)