Train from scratch - TXT chat prompt format? #6507
micheledellaguardia asked this question in Q&A
Hi all, I'm starting to train a new prompt/answer LLM with llama.cpp, but I'm not sure about the right format for the training data passed with --train-data.
I'm trying something like this:
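(A rough sketch of the plain-text layout I mean; the User:/Assistant: labels and the closing </s> are just my own guess at a chat layout, not something I found documented — only the leading <s> marker matches the option below.)

```
<s>User: What is the capital of France?
Assistant: The capital of France is Paris.</s>
<s>User: How many legs does a spider have?
Assistant: A spider has eight legs.</s>
```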
and passing the
--sample-start "<s>" --include-sample-start
options on my command line.
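For concreteness, my full invocation looks roughly like this (the binary name and the data file path are placeholders on my side; the three options are the ones I'm asking about):

```sh
# Hypothetical invocation: binary name and chat-data.txt path are assumptions,
# only --train-data, --sample-start and --include-sample-start are the flags in question.
./train-text-from-scratch \
  --train-data ./chat-data.txt \
  --sample-start "<s>" \
  --include-sample-start
```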
Is this correct, or is there something wrong with my approach?
I saw that just two days ago the developers added chat formats to the llama.cpp source.
Thank you all for your attention, and big thanks to the developers.
Michele