Replies: 2 comments 1 reply
- You are using the old repo. Ollama is available in our repo, but not on our website.
- Update, and something that might help the next guy: thank you @MickLesk for the answer, I found Ollama in the new repo. The installer finished with "🚀 Ollama setup has been successfully initialized!", yet the port it reported was not the one actually in use; instead it was: http://192.168.178.118:11434/
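For anyone landing here with the same symptom: a quick way to confirm which port the Ollama API actually answers on is to query its version endpoint. A minimal sketch (the IP is the one from this thread; 11434 is Ollama's default port):

```shell
# IP taken from this thread; replace with your own Ollama LXC's address.
OLLAMA_URL="http://192.168.178.118:11434"

# /api/version is a standard Ollama endpoint; a JSON reply confirms
# the API is listening at that address and port.
curl -s --max-time 3 "$OLLAMA_URL/api/version" || echo "not reachable at $OLLAMA_URL"
```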
TL;DR: I’m trying to connect PaperlessAI and PaperlessGPT to a local LLM on my server, but the provided Ollama LXC setup script was archived on Nov 2, 2024. What is the current alternative for setting up Ollama with PaperlessAI/PaperlessGPT? Is it possible to configure an Ollama API using Open WebUI, and if so, how?
Hello everyone,
I’m currently trying to set up PaperlessNGX on my server, which I was able to do successfully using the provided setup script. After this, I wanted to integrate PaperlessAI and/or PaperlessGPT, which require connecting to an LLM model. This can either be done via the ChatGPT API or through a local LLM.
To connect to a local LLM, I need to set an Ollama API URL. I found that the helper script previously used to configure an Ollama LXC container, https://github.com/tteck/Proxmox/blob/main/ct/ollama.sh, was archived on November 2, 2024. Since I didn’t use the helper scripts before, I’m wondering whether this is due to the addition of Open WebUI, which may have made a standalone Ollama LXC setup redundant.
However, I’m struggling to figure out how to obtain an Ollama API endpoint from Open WebUI, as it’s not very clear or straightforward.
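For what it's worth, my understanding is that Open WebUI is only a frontend: it talks to an Ollama instance that exposes its own HTTP API (port 11434 by default), so PaperlessAI/PaperlessGPT can usually point straight at Ollama rather than at Open WebUI. A hedged sketch of testing that API directly (the host IP and model name below are placeholders, not something this thread prescribes):

```shell
# Placeholder host; point this at wherever Ollama itself runs (default port 11434).
OLLAMA_HOST="http://192.168.178.118:11434"

# List the models already pulled into this Ollama instance:
curl -s --max-time 3 "$OLLAMA_HOST/api/tags" || echo "Ollama not reachable"

# Non-streaming test generation against the native Ollama API.
# "llama3.2" is an example model name; pull it first with `ollama pull llama3.2`.
curl -s --max-time 30 "$OLLAMA_HOST/api/generate" \
  -d '{"model": "llama3.2", "prompt": "Say hello", "stream": false}' \
  || echo "generate request failed"
```

If both calls return JSON, that base URL is what PaperlessAI/PaperlessGPT need as the Ollama API URL.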
Additionally, I checked the source of the script here: https://raw.githubusercontent.com/tteck/Proxmox/main/misc/build.func
and noticed it is written for Proxmox versions 8.1–8.3. Since I’m running the newer version 8.4, the script can’t be run as-is and would need modifications. I’m unsure whether other changes are required besides editing it for compatibility/security with the newer Proxmox version.
Could anyone provide guidance on the current alternative for setting up Ollama with PaperlessAI/PaperlessGPT? Also, is it still possible to configure an Ollama API using Open WebUI, or is there another recommended approach to achieve this?
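In case it helps whoever answers: once an Ollama instance is running somewhere, PaperlessAI is, as far as I can tell, pointed at it via environment variables along these lines (the variable names and model are assumptions from PaperlessAI's docs, so verify them against the version you deploy):

```shell
# Hypothetical .env fragment for PaperlessAI using a local Ollama instance.
# All names below are assumptions; check the project's README for your version.
AI_PROVIDER=ollama
OLLAMA_API_URL=http://192.168.178.118:11434
OLLAMA_MODEL=llama3.2
```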
I just wanted to make sure before I manually create an Ollama LXC.
Thank you in advance for your help!