Installing Ollama
Jeffry Samuel edited this page Mar 7, 2025 · 9 revisions
Alpaca used to include Ollama in its Flatpak package; this changed to make Ollama optional. Go to Alpaca's store page in your system's app store, look for the extension called Ollama Instance and install it, then reopen Alpaca and enjoy running local models!
You can also install the extension from the command line:

```shell
# Check which installation type you have
flatpak list --columns=app,installation | grep Alpaca

# If you have a system installation
flatpak install com.jeffser.Alpaca.Plugins.Ollama

# If you have a user installation
flatpak install --user com.jeffser.Alpaca.Plugins.Ollama
```
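If you want to confirm the extension was actually installed, `flatpak info` will print its metadata (this is the standard Flatpak command; the plugin ID is the one used above):

```shell
# Prints the extension's metadata if it is installed,
# or exits with an error if it is not
flatpak info com.jeffser.Alpaca.Plugins.Ollama
```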
AMD GPUs require ROCm to be used with AI tools. Alpaca also packages ROCm as an extension, so in addition to `com.jeffser.Alpaca.Plugins.Ollama` you will also need to install `com.jeffser.Alpaca.Plugins.AMD`, available in your system's app store as Alpaca AMD Support.
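The AMD extension installs the same way as the Ollama one; pick the variant that matches the installation type you found with the check above:

```shell
# If you have a system installation
flatpak install com.jeffser.Alpaca.Plugins.AMD

# If you have a user installation
flatpak install --user com.jeffser.Alpaca.Plugins.AMD
```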