UOS-AI support to use ollama local models #7714
Replies: 3 comments
- cc @meiyixiang
- Also, why not use vLLM, since it is more efficient and more hardware-friendly, or add OpenVINO support?
- You can definitely use Ollama with UOS AI: point it at the OpenAI-compatible endpoint http://localhost:11434/v1. I have been using it this way for months. 🙂
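  As a quick sanity check before wiring that address into UOS AI, you can talk to the same endpoint directly. This is just a minimal sketch, not anything UOS-specific: it assumes the `openai` Python package (v1+) is installed and that a model named `llama3` has already been pulled locally (e.g. via `ollama pull llama3`); Ollama ignores the API key, but the client requires a non-empty string.

  ```python
  # Minimal sketch: verify Ollama's OpenAI-compatible endpoint is reachable.
  # Assumptions: `openai` Python package v1+, a locally pulled model "llama3".
  from openai import OpenAI

  client = OpenAI(
      base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible API
      api_key="ollama",  # placeholder; Ollama does not check it
  )

  resp = client.chat.completions.create(
      model="llama3",  # assumed model name; use whatever you have pulled
      messages=[{"role": "user", "content": "Say hello in one sentence."}],
  )
  print(resp.choices[0].message.content)
  ```

  If this prints a reply, the same base URL should work when configured in UOS AI.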
- It would be great if uos-ai gained the ability to use local Ollama models. That way, people who have no internet access or whose workplaces block AI sites (as in my case) could still use AI in Deepin.