2 changes: 1 addition & 1 deletion .github/workflows/ui-ci.yaml
@@ -29,7 +29,7 @@ jobs:
       - name: Checkout UI repository
         uses: actions/checkout@v4
         with:
-          repository: absmach/magistrala-ui-new
+          repository: absmach/magistrala-ui
           path: ui
           token: ${{ secrets.GITHUBPAT }}

4 changes: 3 additions & 1 deletion docker/.env
@@ -211,4 +211,6 @@ UV_CUBE_UI_DOCKER_ACCEPT_EULA=yes
 #vllm
 NVIDIA_VISIBLE_DEVICES=all
 NVIDIA_DRIVER_CAPABILITIES=compute,utility
-VLLM_LOGGING_LEVEL=INFO
+VLLM_LOGGING_LEVEL=INFO
+OLLAMA_BASE_URL=http://cube-proxy:8900
+OLLAMA_DEFAULT_MODEL=tinyllama:1.1b
Comment on lines +215 to +216

Contributor: Rename these variables; we support both Ollama and vLLM (see the Makefile). The LLM target for the UI should just be a proxy URL, which you can name, and it should be built from existing variables.
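A minimal sketch of the suggested rename in docker/.env, assuming the cube-proxy address already used above; the UV_CUBE_LLM_* names are illustrative, not part of the PR:

# Backend-agnostic LLM settings for the UI. The UI only needs the proxy URL,
# so the same variables serve both Ollama and vLLM. Names are hypothetical.
UV_CUBE_LLM_PROXY_URL=http://cube-proxy:8900
UV_CUBE_LLM_DEFAULT_MODEL=tinyllama:1.1b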

14 changes: 0 additions & 14 deletions docker/ollama-compose.yaml
@@ -35,20 +35,6 @@ services:
 #          capabilities:
 #            - gpu

-  open-webui:
-    profiles: ["ollama", "default"]
-    container_name: open-webui
-    image: ghcr.io/open-webui/open-webui:0.3.32-ollama
-    restart: unless-stopped
-    volumes:
-      - open-webui:/app/backend/data
-    ports:
-      - 3000:8080
-    environment:
-      - OLLAMA_BASE_URL=http://ollama:11434
-    networks:
-      - cube-network

Comment on the removed open-webui service

Contributor: Check the vLLM compose file as well.

   pull-tinyllama:
     profiles: ["ollama", "default"]
     image: docker:27.3.1
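A sketch of the follow-up the reviewer asks for, assuming docker/vllm-compose.yaml defines a similar open-webui service (that file is not shown in this PR, so the block below is illustrative): the analogous service would be deleted there in the same way, e.g.

-  open-webui:
-    profiles: ["vllm", "default"]
-    image: ghcr.io/open-webui/open-webui:<tag>
-    ports:
-      - 3000:8080

along with its open-webui entry in the top-level volumes: block, if one exists.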
2 changes: 2 additions & 0 deletions docker/supermq-compose.yaml
@@ -291,6 +291,8 @@ services:
       NEXTAUTH_SECRET: ${UV_CUBE_UI_NEXTAUTH_SECRET}
       NEXTAUTH_URL: ${UV_CUBE_NEXTAUTH_URL}
       NODE_ENV: ${UV_CUBE_NODE_ENV}
+      OLLAMA_BASE_URL: ${OLLAMA_BASE_URL}
+      OLLAMA_DEFAULT_MODEL: ${OLLAMA_DEFAULT_MODEL}
Comment on lines +294 to +295

Contributor: Use more appropriate variables.
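A minimal sketch of one way to apply this, reusing the backend-agnostic names assumed in the .env sketch above (LLM_PROXY_URL and LLM_DEFAULT_MODEL are illustrative, not part of the PR):

      # Suggestion sketch: the UI only needs the proxy URL and a default model,
      # regardless of whether Ollama or vLLM is serving behind the proxy.
      LLM_PROXY_URL: ${UV_CUBE_LLM_PROXY_URL}
      LLM_DEFAULT_MODEL: ${UV_CUBE_LLM_DEFAULT_MODEL}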


   domains-db:
     image: postgres:16.2-alpine