tried using ollama; it works #1013
Replies: 3 comments 2 replies
- Thank you very much, I'm happy to include these updates in the wiki.
- Thank you for creating this awesome piece of software! Happy to help 👍 Here is the error message I get when using
- Thank you. It seems that the Ollama OpenAI API doesn't accept an empty string. I will mention that in the wiki. I think I'm going to make a separate section about Ollama based on your settings.
- I had some issues in the beginning setting up the `config.ini` file, but found their causes after some experimenting. I don't know how to create a pull request, but here is a slightly changed section of the AI setup page of the wiki that should help other people set up their `config.ini` for Ollama more quickly:

Using Other AI Models or GPT-4 via Microsoft Azure
If you have access to other Large Language Models, whether on cloud platforms, university servers, or even your own PC, you can try integrating them into QualCoder. This is, however, a highly experimental option and has not been well tested. We would be interested to hear about your experiences.
The section name must start with the prefix 'ai_model_'.
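To make this concrete, here is a minimal sketch of what such a section could look like for a local Ollama server. Only the 'ai_model_' prefix and the `api_key` behaviour come from this discussion; the section name, the other key names, and the model name are illustrative assumptions and may differ in your QualCoder version:

```ini
[ai_model_ollama]
; Ollama's OpenAI-compatible endpoint; note the trailing slash
api_base = http://localhost:11434/v1/
; Ollama ignores the key, but an empty value or None causes errors,
; so put in any random string
api_key = random_string
; assumed key name; use whatever model you have pulled with `ollama pull`
model = llama3
```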
Again, if you get other services running in QualCoder (or if you tried but failed), we would like to hear about your endeavors.
Summary of suggested changes:

- A `/` at the end
- `api_key = None`, at least for me, caused errors downstream (presumably because of how these OpenAI libraries handle the `NoneType` object). Any `random_string` works, Ollama just ignores it (see the sketch after this list). Maybe this issue with `None` is not limited to Ollama; I couldn't test it.
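To illustrate the `api_key` point, here is a minimal sketch of a direct call against Ollama's OpenAI-compatible endpoint, assuming the `openai` Python package (v1+) is what sits underneath. The model name and the dummy key are placeholders, not QualCoder's actual values:

```python
from openai import OpenAI

# OpenAI(api_key=None, ...) raises openai.OpenAIError unless the
# OPENAI_API_KEY environment variable is set, which is presumably the
# downstream error described above. Any non-empty string avoids it,
# and Ollama simply ignores the value.
client = OpenAI(
    api_key="random_string",                # placeholder; Ollama ignores it
    base_url="http://localhost:11434/v1/",  # note the trailing slash
)

response = client.chat.completions.create(
    model="llama3",                         # any model pulled via `ollama pull`
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```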