Ollama Local LLM Support #9
Closed
Crashdowne started this conversation in General

Would it be possible to add Ollama support for local AI models in the future? I realize that up-to-date research papers might still need a paid AI model, but for summarizing papers and potentially extraction, I think a locally hosted option would be awesome. It would help keep some costs down (although the hardware for local hosting can be expensive).

Replies: 2 comments
- Hi, yes, Ollama support is next on the feature list! We have a new release coming out shortly with some bug fixes, and then integrating Ollama is next!
- Ollama support is now incorporated: https://github.com/Academic-ID/sapienAI/releases/tag/v0.3.0
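For context on what the locally hosted option involves, here is a minimal sketch of summarizing text against a locally running Ollama instance. It assumes Ollama's default endpoint (http://localhost:11434) and an already-pulled model such as llama3; it is only an illustration of the workflow, not sapienAI's actual integration code.

```python
# Minimal sketch: summarize text with a locally hosted Ollama model.
# Assumes Ollama is running on its default port (11434) and that a model
# such as "llama3" has been pulled; model name and prompt are illustrative.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint


def summarize(text: str, model: str = "llama3") -> str:
    """Ask a local Ollama model for a short summary of `text`."""
    payload = {
        "model": model,
        "prompt": f"Summarize the following paper excerpt in three sentences:\n\n{text}",
        "stream": False,  # return a single JSON object instead of a token stream
    }
    resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["response"]


if __name__ == "__main__":
    print(summarize("Ollama lets you run large language models on your own hardware..."))
```

Setting `stream` to `False` keeps the example simple by asking Ollama for one complete response rather than a stream of partial tokens.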