How to use llama.cpp to generate answers in NLI #3379
Unanswered
ClaudiuCreanga asked this question in Q&A
I am looking to use llama.cpp to generate hypotheses from premises (NLI).

Say I have the premise "man is sitting on the couch". I want the model to generate a neutral hypothesis such as "the keyboard is in English", or a contradictory hypothesis such as "the man is standing". I want it to do that for 10k examples.

I have llama.cpp working in interactive mode using:

But I don't think that's the best way for this task.

Replies: 1 comment

- You can use
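Since each premise needs only a single completion, one option is to drive llama.cpp's one-shot (non-interactive) mode from a script rather than typing into interactive mode. The sketch below is an assumption-laden illustration, not from this thread: the binary path (`./main`), model path, prompt template, and sampling flags are all placeholders you would adapt to your setup.

```python
# Sketch: batch NLI hypothesis generation by invoking llama.cpp once per
# prompt. Paths, flags, and the prompt template are assumptions, not
# taken from the original discussion.
import subprocess

PROMPT_TEMPLATE = (
    "Premise: {premise}\n"
    "Write a {relation} hypothesis for the premise above.\n"
    "Hypothesis:"
)

def build_nli_prompt(premise: str, relation: str) -> str:
    """Format one premise into a single-turn generation prompt."""
    return PROMPT_TEMPLATE.format(premise=premise, relation=relation)

def generate(premise: str, relation: str) -> str:
    """Run the llama.cpp CLI once for this prompt (hypothetical paths)."""
    prompt = build_nli_prompt(premise, relation)
    out = subprocess.run(
        ["./main", "-m", "./models/model.gguf",  # placeholder model path
         "-p", prompt, "-n", "64", "--temp", "0.7"],
        capture_output=True, text=True, check=True,
    )
    # llama.cpp echoes the prompt before the completion, so strip it.
    return out.stdout[len(prompt):].strip()

if __name__ == "__main__":
    premises = ["man is sitting on the couch"]  # load your 10k premises here
    for p in premises:
        for rel in ("neutral", "contradictory"):
            # Swap build_nli_prompt for generate once paths are configured.
            print(build_nli_prompt(p, rel))
```

For 10k examples this loop pays the model-load cost on every call; running llama.cpp's server mode (or the llama-cpp-python bindings) and keeping the model resident would be considerably faster, at the cost of a bit more setup.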