Replies: 1 comment
- Hi, that should already work. It's not included by default because this is a somewhat opinionated package meant to showcase the possibilities of self-hosted AI models. You can easily add LM Studio through its OpenAI-compatible API: create a new credential on the OpenAI node and set the base URL to your LM Studio instance. If you need more help with that, feel free to jump into the community forum!
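
For illustration, here is a minimal sketch of the OpenAI-compatible request the reply above describes. The base URL and model name are assumptions (LM Studio's local server typically listens on port 1234; substitute whatever your instance shows):

```python
import json
import urllib.request

# Assumption: LM Studio's default local server address; use your own base URL.
BASE_URL = "http://localhost:1234/v1"

def chat_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat-completion request body."""
    return {
        "model": model,  # the model identifier shown in LM Studio
        "messages": [{"role": "user", "content": prompt}],
    }

def chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Prepare (but do not send) the POST request LM Studio expects."""
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(chat_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = chat_request("local-model", "Hello!")
# urllib.request.urlopen(req) would return an OpenAI-style JSON response.
```

Inside n8n itself no code is needed: pointing the OpenAI node's credential at the same base URL accomplishes the same thing.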
- Please support LM Studio (for MLX Apple silicon support)