Installation of fabric in a Python 3 virtual environment and then installing ollama with local LLMs #195
Replies: 1 comment 2 replies
-
Thanks for this. It all worked up to running the "listmodels" command.
I have the models, and Ollama is listening locally. NB: in your code you have an extra "1" at the end of the port. This is my config. Any clue on what I might be missing would be highly appreciated, thanks!
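For reference, a quick way to confirm that Ollama is answering on its default port (11434, i.e. without the extra trailing "1"):

```bash
# Should print "Ollama is running" if the server is up on the default port
curl http://localhost:11434
```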
-
Hi everybody, I just want to show how I got ollama working with fabric.
Installation of fabric in a Python 3 virtual environment and then installing ollama with local LLMs
Use a virtual environment to prevent conflicts with other Python projects.
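Something along these lines (the directory name fabric-venv is only an example):

```bash
# Create and activate a dedicated virtual environment
python3 -m venv ~/fabric-venv
source ~/fabric-venv/bin/activate
```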
Install fabric:
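A rough sketch of the install inside the activated environment, assuming the (legacy) Python version of fabric installed from the cloned repo; the --setup step is where keys and patterns get configured:

```bash
# Inside the activated virtualenv
git clone https://github.com/danielmiessler/fabric.git
cd fabric
pip install .      # installs the fabric CLI into the virtualenv
fabric --setup     # one-time setup: API keys and pattern download
```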
Installing ollama:
Download the client from ollama.com and install it.
In a terminal:
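Roughly the commands involved; llama3 is only an example model name:

```bash
# Pull a model for local use (llama3 is just an example)
ollama pull llama3

# Confirm the model is available locally
ollama list
```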
Check whether fabric can also "see" the local models.
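If I remember right, this is the --listmodels flag in the Python version; the local ollama models should show up in the output:

```bash
# Local ollama models should appear in the model listing
fabric --listmodels
```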
Try it by copying some text and then:
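On macOS that could look like the following (summarize is one of the stock patterns, and llama3 stands in for whatever local model --listmodels reported):

```bash
# Pipe clipboard text through a pattern, using a local ollama model
pbpaste | fabric --pattern summarize --model llama3
```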