Hello, I'd like to use wizard-vicuna instead of codellama. Is this feasible? (I have limited disk space and can't afford a separate coding-optimized model.)

I tried changing the model and I get the error "Sorry, I don't understand. Please try again." and a spinning indicator.

I've changed to
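For anyone debugging the same error, a useful first step is to confirm the model answers outside the editor. Below is a minimal TypeScript sketch (Node 18+ for built-in fetch, assuming a default Ollama install on port 11434) that calls Ollama's native /api/generate endpoint directly; the prompt string is only an example.

```typescript
// Sanity check: does wizard-vicuna-uncensored respond via Ollama's native
// REST API, independent of any editor extension?
// Run `ollama pull wizard-vicuna-uncensored` first.
async function checkModel(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "wizard-vicuna-uncensored",
      prompt: "Write a hello-world function in Python.",
      stream: false, // return one JSON object instead of a token stream
    }),
  });
  if (!res.ok) {
    throw new Error(`Ollama returned ${res.status}: ${await res.text()}`);
  }
  const data = (await res.json()) as { response: string };
  console.log(data.response);
}

checkModel().catch(console.error);
```

If this call fails, the problem lies with the model or the Ollama install rather than with the extension configuration.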
Replies: 3 comments

-
Please include more details about which provider you are using, e.g. Ollama, and the exact model you are trying to use. Also, please include details of your chat.hbs template. Thanks.
-
It's Ollama wizard-vicuna-uncensored: https://ollama.com/library/wizard-vicuna-uncensored. Sorry, I don't know how to get the chat.hbs template.
-
It should work by default now when using Ollama, as twinny now supports the OpenAI API specification.
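For context, "supports the OpenAI specification" means the extension can talk to any server that exposes the OpenAI chat-completions API, and Ollama serves one at /v1. A minimal sketch of such a request (TypeScript, Node 18+, default Ollama port; the prompt and model name are illustrative, and the model must already be pulled):

```typescript
// Ollama exposes an OpenAI-compatible endpoint at /v1, which is what lets
// OpenAI-style clients talk to local models without a custom template.
async function chatViaOpenAICompat(): Promise<void> {
  const res = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "wizard-vicuna-uncensored",
      messages: [{ role: "user", content: "Explain what a closure is." }],
    }),
  });
  const data = (await res.json()) as {
    choices: { message: { content: string } }[];
  };
  console.log(data.choices[0].message.content);
}

chatViaOpenAICompat().catch(console.error);
```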