Replies: 2 comments
-
Hi, I think that LangChain is not the best place to have this question answered. It seems to be a question about the model rather than the LangChain framework.
Is the model that you are using exactly the same (the same size and quantization)? Are other model parameters the same? Also, model outputs are stochastic when the temperature is not zero, so running the same prompt multiple times can produce different results. Two resources you might like to investigate: if you are interested in getting output that meets specific format expectations, look at structured outputs; if you are interested in how to write a prompt, look at prompt engineering guides. Hope this helps.
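To illustrate the structured-output idea mentioned above, here is a minimal sketch. The `fake_llm` helper is a hypothetical stand-in for a real model call (a real setup would invoke a chat model, e.g. via LangChain's `with_structured_output`); the point is that asking the model for JSON and parsing it gives you a short, machine-checkable answer instead of free-form text.

```python
import json

def fake_llm(prompt: str) -> str:
    # Hypothetical stand-in for a real model call. A real setup would
    # send `prompt` to an LLM that has been instructed to reply in JSON.
    return '{"answer": "Paris"}'

def ask(question: str) -> dict:
    # Ask for a constrained JSON shape rather than free-form prose.
    prompt = (
        "Answer the question below. Reply with JSON of the form "
        '{"answer": "<short answer>"} and nothing else.\n'
        f"Question: {question}"
    )
    return json.loads(fake_llm(prompt))

result = ask("What is the capital of France?")
print(result["answer"])  # Paris
```

With a real model you would still want to validate the parsed JSON, since small models do not always follow format instructions reliably.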
-
Hi,
-
Checked other resources
Commit to Help
Example Code
Description
I am very new to LangChain and I am trying to understand how to properly work with prompts to get good, brief answers from small local models from Hugging Face.
I am trying to use "meta-llama/Llama-3.2-1B" with very simple prompts and a simple question like: what is the capital of France?
Here is my prompt. The template
template="You are a helpful assistant. Answer the following question: {question} \nAnswer:"
gives the answer:
You are a helpful assistant. Answer the following question: What is the capital of France?
Answer: Paris
Explanation: The capital of France is Paris. The capital of France is Paris. The capital of France is Paris. The capital of France is Paris. The capital of France is Paris. The capital of France is Paris. The capital of France is Paris. The capital of France is Paris. The capital of France
If I use:
template="You are a helpful assistant. Answer the following question: {question} "
it gives the answer:
You are a helpful assistant. Answer the following question: What is the capital of France? 1. Paris 2. Lyon 3. Marseille 4. Toulouse 5. Nice 6. Strasbourg 7. Bordeaux 8. Nantes 9. Lille 10. Montpellier 11. Rennes 12. Toulon 13. Tours 14. Angers
I am not sure why a small nuance like "Answer:" in the template has such an impact. I have tried to use the same model with Ollama and it works as expected.
The first answer is better, but it is still not as brief and precise as I would like, and there is a lot of redundant context around it.
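A base (non-instruct) completion model like this will keep continuing the text after "Answer:", which is why the output rambles. One workaround is to post-process the completion yourself: keep only what follows the "Answer:" marker and cut at the first newline. A minimal sketch, using the first output quoted above as the raw string (`extract_answer` is an illustrative helper name, not a LangChain API):

```python
def extract_answer(completion: str, marker: str = "Answer:") -> str:
    # Keep only the text after the marker, up to the first newline.
    after = completion.split(marker, 1)[1]
    return after.splitlines()[0].strip()

raw = (
    "You are a helpful assistant. Answer the following question: "
    "What is the capital of France?\nAnswer: Paris\n"
    "Explanation: The capital of France is Paris."
)
print(extract_answer(raw))  # Paris
```

Many generation backends can do the equivalent at generation time: limiting `max_new_tokens` and supplying a stop sequence such as `"\n"` (where the backend supports stop sequences) cuts the model off before it starts adding redundant context.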
System Info
System Information
Package Information
Optional packages not installed
Other Dependencies