Local LLMs
#173
Replies: 2 comments 2 replies
-
When will SK support offline/local LLMs instead of remote ones that we must go through OpenAI to access?
-
Hi @lilhoser, multi-model support is in a PR right now; you should see it in under a week. @dluc, @dmytrostruk
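Until that PR lands, the usual workaround for the question above is to run a local server that speaks the OpenAI-compatible chat-completions protocol (llama.cpp's server and LM Studio both expose one) and point the client at it instead of api.openai.com. The sketch below is an assumption-laden illustration, not SK's API: the base URL, port, and model name are hypothetical, and only the request is built here; sending it requires an actual server.

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request aimed at a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Hypothetical local endpoint standing in for api.openai.com.
req = build_chat_request("http://localhost:8080", "local-model", "Hello!")
print(req.full_url)  # -> http://localhost:8080/v1/chat/completions
# Only attempt to send when a local server is actually running:
# resp = urllib.request.urlopen(req)
```

Because the request shape is the same as OpenAI's, swapping between a remote and a local model becomes a configuration change rather than a code change.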