I love the idea of this project. Well done on what you've done so far. Would an API (like ollama has) be something that can be incorporated into the root node?
EDIT: Just read the below bit in the medium doc. Will open WebUI connect to the root device on the same "ollama" port?
If you want to run the API service that supports the /v1/chat/completions endpoint, you should build the dllama-api application and run it on the root device instead of `dllama inference`.
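For anyone wiring this up, a minimal client sketch against the OpenAI-compatible endpoint might look like the following. Note the host, port, and `max_tokens` value here are assumptions for illustration — use whatever address and options you actually start dllama-api with:

```python
import json
import urllib.request

# Assumed address of the root device running dllama-api; adjust to your setup.
API_URL = "http://localhost:9990/v1/chat/completions"

def build_payload(prompt: str) -> dict:
    # Standard OpenAI-style chat-completions request body.
    return {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,  # illustrative limit, not a dllama-api default
    }

def chat(prompt: str) -> str:
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible servers put the reply under choices[0].message.content.
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("Hello from the root node!"))
```

Since the endpoint follows the OpenAI shape, a frontend like Open WebUI should be able to point at the same base URL, assuming it lets you set a custom OpenAI-compatible API address.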