I have a Docker setup running Ollama and Open WebUI. Both containers are on the same Docker network, and Ollama works fine with Open WebUI. Here is the docker-compose.yml file for this:

```yaml
services:
  ollama:
    container_name: ollama
    image: ollama/ollama:latest
    pull_policy: always
    ports:
      - "11434:11434"
    volumes:
      - ollama:/root/.ollama
    restart: unless-stopped
    tty: true
    environment:
      - OLLAMA_FORCE_CPU=1
      - OLLAMA_NUM_THREADS=4
      - OLLAMA_DEBUG=1
      - OLLAMA_MAX_LOADED_MODELS=1
      - OLLAMA_MAX_LOADED_MODELS_PER_USER=1

  open-webui:
    container_name: open-webui
    image: ghcr.io/open-webui/open-webui:main
    pull_policy: always
    ports:
      - "3000:8080"
    depends_on:
      - ollama
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
      - HOST=0.0.0.0
    volumes:
      - open-webui:/app/backend/data
    restart: unless-stopped
    networks:
      - default
    extra_hosts:
      - "host.docker.internal:host-gateway"

volumes:
  ollama:
  open-webui:
```
Since the Ollama instance's port is exposed at 11434, I can reach it from my terminal; a check along the lines of the sketch below succeeds.
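(A sketch, assuming Ollama's standard `/api/tags` endpoint; not necessarily the exact command I used:)

```bash
# From the host machine: ask Ollama for its list of pulled models.
# Any JSON response confirms the published port 11434 is reachable from the host.
curl http://localhost:11434/api/tags
```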
However, when I try to connect my Laravel app, which uses Prism and runs on a separate Docker network, to this Ollama instance, I keep getting a cURL connection error. I have tried several things, but nothing has worked.
Is there something special I need to do?
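(For completeness, the failure can be reproduced without Prism by making the same request from inside the app's container; the container name below is a placeholder, and this assumes curl is available in that image:)

```bash
# Run the request from inside the Laravel app's container ("laravel-app" is a placeholder).
docker exec -it laravel-app curl -v http://localhost:11434/v1/api/chat
# This fails with a connection error, the same way the Prism call does.
```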
Replies: 1 comment 2 replies
This is not a Prism issue itself; rather, it's a networking issue.
The cURL error shows that the URL `http://localhost:11434/v1/api/chat` is not accessible, which means Ollama is not reachable at `localhost:11434` from where the request is made. Try using the container's IP address instead:

```bash
curl http://CONTAINER_IP_ADDRESS:11434/v1/api/chat
```
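A sketch of how you might find that IP and, if direct IP access still fails, attach the Laravel container to the same network; the network and container names below are placeholders (check `docker network ls` and `docker ps` for the real ones):

```bash
# Print the IP address the Ollama container has on its Docker network(s).
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}} {{end}}' ollama

# Test the chat endpoint with that IP (substitute the address printed above).
curl http://CONTAINER_IP_ADDRESS:11434/v1/api/chat

# Alternatively, attach the Laravel app's container to Ollama's compose network
# so it can reach Ollama by service name instead of by IP.
docker network connect ollama_default laravel-app
docker exec -it laravel-app curl http://ollama:11434/v1/api/chat
```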