- Checked other resources
- Commit to Help
Example Code

```python
from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="mistral-nemo",
    temperature=0,
    base_url="http://[server_ip]:11434"
)
llm.invoke("Hello, how are you?")
```

Description

When trying to use the

System Info

Package Information

Packages not installed (Not Necessarily a Problem)
The following packages were not found:
Replies: 3 comments 6 replies
-
Hey there, @jzalger! I'm here to help you with any bugs, questions, or contributions you might have. Let's tackle this issue together!

To specify a remote server URL for the Llamafile class, set the LLAMAFILE_SERVER_BASE_URL environment variable. Here is an example of how to set it within your Python script:

```python
import os

# Set the remote server URL
os.environ["LLAMAFILE_SERVER_BASE_URL"] = "http://[server_ip]:11434"

# Now you can use the Llamafile class as usual
from langchain_community.llms.llamafile import Llamafile

llm = Llamafile()
output = llm.invoke("Hello, how are you?")
print(output)
```

Make sure to replace [server_ip] with your server's address. Additionally, there is a known issue with the
-
For anyone struggling with this, I dug into the code a bit and found the correct env variable is OLLAMA_HOST, which should be set to the full URL including the port, e.g. http://SERVER_IP:11434. You don't need the base_url param like in the community Ollama packages.
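A minimal sketch of this approach, assuming (as reported above) that the client honors OLLAMA_HOST when no base_url is given; the address below is the same placeholder used elsewhere in this thread, and the ChatOllama usage is shown commented since it requires langchain-ollama to be installed:

```python
import os

# Point the Ollama client at the remote server; [server_ip] is a
# placeholder for your server's actual address.
os.environ["OLLAMA_HOST"] = "http://[server_ip]:11434"

# With the variable set, no base_url argument should be needed:
#   from langchain_ollama import ChatOllama
#   llm = ChatOllama(model="mistral-nemo", temperature=0)
#   llm.invoke("Hello, how are you?")
```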
-
I noticed this in the code, and it seems to be working for me if I initialize my ChatOllama instance with a base_url argument (not server_url). While changing the environment variable might be convenient for some, it makes it difficult to communicate with several Ollama servers at once. My electric bill will never forgive me!
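To illustrate the multi-server point: a per-instance base_url lets each client target its own server, where a single OLLAMA_HOST env var is global. A sketch with hypothetical hostnames (the ChatOllama part is commented since it needs langchain-ollama installed):

```python
# Hypothetical server addresses -- substitute your own machines.
servers = {
    "gpu-box": "http://gpu-box.local:11434",
    "cpu-box": "http://cpu-box.local:11434",
}

# Because base_url is a constructor argument, each instance can point
# at a different server:
#   from langchain_ollama import ChatOllama
#   llms = {name: ChatOllama(model="mistral-nemo", base_url=url)
#           for name, url in servers.items()}
#   llms["gpu-box"].invoke("Hello, how are you?")
```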