Open WebUI support #199

Open · Sboshoff76 opened this issue Apr 19, 2025 · 1 comment

Sboshoff76 commented Apr 19, 2025

Hi

I love the idea of this project. Well done on what you've done so far. Would an API (like the one ollama has) be something that could be incorporated into the root node?

EDIT: Just read the bit below in the Medium doc. Will Open WebUI connect to the root device on the same "ollama" port?

If you want to run the API service that supports the /v1/chat/completions endpoint, you should build the dllama-api application and run it on the root device instead of dllama inference.
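
For reference, a minimal sketch of querying that endpoint once dllama-api is running on the root device. The host and port here are placeholders (substitute whatever address your dllama-api instance is listening on), and the exact set of accepted request fields is an assumption based on the OpenAI chat-completions schema:

```python
# Minimal sketch: query dllama-api's OpenAI-compatible chat endpoint.
# Assumption: the root device runs dllama-api on localhost:9999; adjust
# the host/port to match your setup. Uses only the standard library.
import json
import urllib.request

payload = {
    "messages": [
        {"role": "user", "content": "Hello, what can you do?"}
    ],
    "max_tokens": 128,
}

req = urllib.request.Request(
    "http://localhost:9999/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

# OpenAI-style responses nest the reply under choices[0].message.
print(body["choices"][0]["message"]["content"])
```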

@unclemusclez

It is OpenAI compatible, so it should work. I believe I tried this months ago.

dllama-api is a separate executable. dllama is more like llama.cpp than ollama, if that's where you are coming from.
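
To expand on the OpenAI-compatibility point: any OpenAI-compatible client should be able to talk to dllama-api just by overriding the base URL. A sketch using the openai Python package; the host/port are placeholders, and the dummy api_key reflects an assumption that dllama-api does not validate keys:

```python
# Sketch: talk to dllama-api through the standard openai client by
# overriding base_url. Assumes dllama-api listens on localhost:9999;
# the api_key is a dummy value (an assumption; supply a real one if
# your deployment checks it).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:9999/v1",  # root device running dllama-api
    api_key="not-needed",
)

response = client.chat.completions.create(
    model="dllama",  # placeholder; the server may ignore this field
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```

If I remember right, Open WebUI does the equivalent of this internally: you add an OpenAI-compatible connection in its settings and point the base URL at the root device's dllama-api address, rather than using the ollama port.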
