
Help with model output inconsistencies? #4058

Answered by KerfuffleV2
AoifeHughes asked this question in Q&A

The first trailing newline in a prompt is stripped off. So if you want an actual newline in the prompt, try adding two. In other words, if your prompt is:

This is my prompt. <-- newline here

and you want that newline, then you want the file to look like:

This is my prompt. <-- newline
<-- newline
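The stripping behavior can be illustrated with a small sketch. The helper name `strip_one_trailing_newline` is hypothetical, and it simply mimics the behavior described above under the assumption that exactly one trailing newline is removed:

```python
def strip_one_trailing_newline(prompt: str) -> str:
    # Mimics the described loader behavior: exactly one trailing
    # newline is removed, if one is present.
    if prompt.endswith("\n"):
        return prompt[:-1]
    return prompt

# One trailing newline disappears entirely:
print(repr(strip_one_trailing_newline("This is my prompt.\n")))    # 'This is my prompt.'
# Two trailing newlines leave the single newline you wanted:
print(repr(strip_one_trailing_newline("This is my prompt.\n\n")))  # 'This is my prompt.\n'
```

So the file with two trailing newlines ends up handing the model a prompt with the one newline you intended.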

Also, I'm not really familiar with Ollama, but you're not using the correct LLaMA2-chat instruction format. Maybe Ollama is fixing that for you somehow behind the scenes. If I remember correctly, it's supposed to be like:

[INST]Do the thing.
And do the other thing too.[/INST]
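That wrapping could be expressed as a small helper. The function name `format_llama2_chat` is hypothetical, and it assumes the single-turn `[INST]...[/INST]` markers shown above:

```python
def format_llama2_chat(user_message: str) -> str:
    # Wrap a single user turn in the [INST]...[/INST] markers
    # from the LLaMA2-chat instruction format shown above.
    return f"[INST]{user_message}[/INST]"

prompt = format_llama2_chat("Do the thing.\nAnd do the other thing too.")
print(prompt)
```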

You can sometimes get away with violating the prompt format the model was trained on but sometimes it can make a pretty big…

Answer selected by AoifeHughes