Replies: 1 comment
same question here
Description
Hi,
I’m using Ollama to run a custom model and I’m integrating it with LangChain. I create the model in Ollama with:
ollama create {model_name} -f Modelfile
In the Modelfile I specify a system prompt and some parameters, and I’d like to know:
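For context, the Modelfile looks roughly like this (the base model, system prompt, and parameter values below are only illustrative placeholders, not my real settings):

```
FROM llama3
SYSTEM """You are a helpful assistant that always answers concisely."""
PARAMETER temperature 0.2
PARAMETER num_ctx 4096
```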
Do the system prompt and parameters in the Modelfile persist when the model is used in LangChain?
Or do I need to redefine the system prompt and parameters separately when using the model with LangChain?
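For reference, here is roughly how I call the model from LangChain (a minimal sketch; it assumes the langchain-ollama package, and the model name is just a placeholder):

```python
# Minimal sketch of the LangChain side (assumes the langchain-ollama package).
# "my-custom-model" is a placeholder for the name passed to `ollama create`.
from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="my-custom-model",
    # temperature=0.2,  # unsure whether this is needed here, or whether the
    #                   # PARAMETER values from the Modelfile already apply
)

response = llm.invoke("Hello, who are you?")
print(response.content)
```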
Thank you for your help!