Replies: 2 comments 3 replies
-
Maybe remove the last backslash?
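For reference, the Ollama base URL entered in a client such as Paperless AI typically looks like the following (host and port here are placeholders for your server's address), with no trailing slash or backslash:

```
http://192.168.1.10:11434
```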
-
By default, Ollama listens on local connections only. You need to start the server with `OLLAMA_HOST=0.0.0.0 ollama serve`, or set `OLLAMA_HOST` in the service settings.
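A minimal sketch of exposing Ollama on all network interfaces; the service name and the `192.168.1.10` address are assumptions for a stock Linux install, so adjust them for your setup:

```shell
# One-off: start the server bound to all interfaces, not just 127.0.0.1.
# OLLAMA_HOST must be set on the *server* process (ollama serve),
# not on the client-side "ollama run" command.
OLLAMA_HOST=0.0.0.0 ollama serve

# If Ollama runs as a systemd service (the default for the Linux installer),
# set the variable via an override instead:
sudo systemctl edit ollama.service
# ...and add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl daemon-reload && sudo systemctl restart ollama

# Verify from another machine (replace 192.168.1.10 with the server's IP):
curl http://192.168.1.10:11434/api/tags
```

If the `curl` check returns a JSON list of models, the same URL should work as the Ollama endpoint in Paperless AI.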
-
I tried to connect Ollama, but I can't get it working.
I've installed it on Ubuntu 20.04.6. If I enter the IP with port 11434 in my browser, it tells me "Ollama is running".
After I run `ollama run llama3.2`, it starts and is available via the terminal.
If I put it into Paperless AI, it says:

I've downloaded llama3.2 3b.
Any tutorial, maybe?