Installing Ollama

Flatpak

Alpaca used to include Ollama in its Flatpak packages; this was changed to make Ollama optional.

Flathub

Go to Alpaca's store page in your system's app store, look for the extension called Ollama Instance, and install it. Then reopen Alpaca and enjoy running local models!

You can also install the extension from the command line:

# Check which installation type you have
flatpak list --columns=app,installation | grep Alpaca

# If you have a system installation
flatpak install com.jeffser.Alpaca.Plugins.Ollama

# If you have a user installation
flatpak install --user com.jeffser.Alpaca.Plugins.Ollama
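Afterwards you can do a quick sanity check that the plugin was actually installed next to the app (a minimal sketch; the exact output columns depend on your Flatpak version):

# The app and the plugin should both appear in the list
flatpak list | grep com.jeffser.Alpaca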

AMD GPU Support

AMD GPUs require ROCm to work with AI tools. Alpaca packages ROCm as an extension as well, so in addition to com.jeffser.Alpaca.Plugins.Ollama you will also need to install com.jeffser.Alpaca.Plugins.AMD, available as an Alpaca extension in your system's app store under the name Alpaca AMD Support.
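The AMD extension can also be installed from the command line, mirroring the commands above (pick the variant matching your installation type):

# If you have a system installation
flatpak install com.jeffser.Alpaca.Plugins.AMD

# If you have a user installation
flatpak install --user com.jeffser.Alpaca.Plugins.AMD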

Arch Linux

Important

Alpaca doesn't support Arch Linux officially

Ollama is installed on Arch Linux the same way as any other package: the base ollama package is available in the official repositories, while alternative builds are available in the AUR.
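A minimal sketch of the usual steps, assuming the packaged ollama.service systemd unit (check the package contents if unsure):

# Install the base package from the official repositories
sudo pacman -S ollama

# Start the Ollama server now and enable it at boot
sudo systemctl enable --now ollama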

Nixpkgs

Important

Alpaca doesn't support Nix officially

The Nix package is maintained by @Aleksanaa; any issues with the package should be reported at Nixpkgs issues.

Please read the installation instructions, which explain how to select between ollama-cuda and ollama-rocm.
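As an illustrative sketch only (assuming a flakes-enabled Nix and the ollama, ollama-cuda, and ollama-rocm attributes mentioned above), an imperative install could look like this; NixOS users will typically prefer their system configuration instead:

# Default CPU build
nix profile install nixpkgs#ollama

# GPU-accelerated variants
nix profile install nixpkgs#ollama-cuda   # NVIDIA
nix profile install nixpkgs#ollama-rocm   # AMD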
