Added LM Studio Support to Config File/llm_wrapper #27
JonathanMitchell1234 started this conversation in Ideas
-
I think PR #1 already has that covered, and so does another that is still open. No need to reinvent the wheel; clearly there is a lot of interest in a general-purpose OpenAI-compatible endpoint config. Hopefully we can settle on a solution soon.
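  For illustration, here is a minimal sketch of what such a general-purpose endpoint config could look like, written as a Python settings dict. Every key and value below is an assumption for the sake of the example, not the repo's actual schema:

  ```python
  # Hypothetical config sketch: one set of fields that would cover LM Studio,
  # Ollama, vLLM, or any other OpenAI-compatible server. Key names are illustrative.
  LLM_CONFIG = {
      "backend": "openai_compatible",          # generic backend selector
      "base_url": "http://localhost:1234/v1",  # LM Studio's default local server address
      "api_key": "not-needed",                 # local servers typically ignore the key
      "model": "local-model",                  # whichever model is loaded in the server
  }
  ```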
-
@TheBlewish I added some code to the config file and the llm_wrapper to support LM Studio-generated endpoints, and it's working. (I already had LM Studio installed and didn't feel like redownloading Ollama, lol, so I figured I'd just make the modifications.) If you're interested in supporting it alongside Ollama, let me know and I'll send a pull request.
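  For anyone curious what such a change might look like, here is a rough sketch of an llm_wrapper method calling LM Studio's OpenAI-compatible server via the official openai Python client. The class and method names are placeholders, not the actual code from this change:

  ```python
  # Sketch only: LM Studio serves an OpenAI-compatible API (default port 1234),
  # so the standard openai client can talk to it by overriding base_url.
  from openai import OpenAI

  class LLMWrapper:
      def __init__(self, base_url="http://localhost:1234/v1", model="local-model"):
          # api_key is required by the client but ignored by the local server
          self.client = OpenAI(base_url=base_url, api_key="lm-studio")
          self.model = model

      def generate(self, prompt: str) -> str:
          # Single-turn chat completion against whatever model LM Studio has loaded
          response = self.client.chat.completions.create(
              model=self.model,
              messages=[{"role": "user", "content": prompt}],
          )
          return response.choices[0].message.content

  # Usage: wrapper = LLMWrapper(); print(wrapper.generate("ping"))
  ```

  The same wrapper would also work against Ollama's OpenAI-compatible endpoint (http://localhost:11434/v1) just by changing base_url, which is why a single generic config could cover both backends.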