Allow indexing with Ollama to use any model #5579
johnnyasantoss
started this conversation in
Feature Requests
Replies: 1 comment
- I raised a similar enquiry; I was trying to use DeepInfra.
-
In the OpenAI-compatible setup, Roo lets you enter any model and any embedding dimension. The same should be possible with Ollama.
I want to use qwen3:0.6b locally, since the MTEB leaderboard shows it performs much better than the alternatives Roo currently recommends on the setup page.
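For illustration, a sketch of what the requested flexibility might look like in a settings file. The field names below are hypothetical, not Roo's actual configuration schema; the point is simply that the model name and dimension are free-form values rather than a fixed dropdown, mirroring the OpenAI-compatible provider:

```json
{
  "codebaseIndexing": {
    "embedderProvider": "ollama",
    "ollamaBaseUrl": "http://localhost:11434",
    "embeddingModel": "qwen3:0.6b",
    "embeddingDimension": 1024
  }
}
```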