How to generate multiple answers in llama.cpp? #4467
Hyperplane2021 asked this question in Q&A · Unanswered · 0 replies
I have only recently started experimenting with the Llama 2 model. In the GPT-3.5 API, the `n` parameter controls how many completions are returned. How can I do the same in llama.cpp? Specifically:
```python
response = lcpp_llm(
    prompt=prompt2,
    max_tokens=256,
    temperature=0.9,
    top_p=1,
    repeat_penalty=1.2,
    top_k=150,
    logprobs=5,
    echo=False,
)
```
How can I get `response["choices"][1]` (or more), instead of only `response["choices"][0]`?
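While waiting for an answer: one workaround, sketched below under the assumption that each call to the model returns an OpenAI-style response with a single entry in `"choices"`, is to call the model once per desired sample and merge the results yourself. The `generate_n` helper is hypothetical (not part of llama-cpp-python); `generate` would be your `lcpp_llm` instance.

```python
def generate_n(generate, prompt, n=3, **kwargs):
    """Call `generate` (e.g. an llama_cpp.Llama instance) `n` times and
    merge the single-completion responses into one OpenAI-style
    `choices` list, reindexing each choice.

    Hypothetical helper: llama.cpp itself has no `n` parameter, so this
    simply samples the model repeatedly.
    """
    choices = []
    for i in range(n):
        resp = generate(prompt=prompt, **kwargs)
        # Each individual response has one choice at index 0;
        # copy it and assign the position in the merged list.
        choice = dict(resp["choices"][0], index=i)
        choices.append(choice)
    return {"choices": choices}
```

With a non-zero `temperature` each call samples independently, so `generate_n(lcpp_llm, prompt2, n=3, max_tokens=256, temperature=0.9)["choices"][1]` would give a second, distinct answer. Batched parallel decoding inside a single call is a separate feature of the llama.cpp server/examples and is not covered by this sketch.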