Support for configuring locally hosted LLM servers #220
amithkoujalgi started this conversation in Ideas
Replies: 1 comment
-
Hi all, does anyone have a solution to this, or any pointers on whether there is a way to do it?
-
Hello.
First of all, thank you for creating this plugin. It's amazing! 🤩
I was wondering if we could have support for configuring locally hosted LLM servers in different ways. For instance, I can have an Ollama Docker setup running locally with an exposed port. This helps me isolate the LLM server while still fully utilising its capabilities.
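For context, here's a minimal sketch of the kind of setup I mean, assuming the container exposes Ollama's default port 11434 and its standard /api/generate endpoint. The environment variable name, base URL, and model name are just placeholders, not anything the plugin currently supports:

```python
import os
import requests

# Assumption: the Ollama container was started with its default port published, e.g.
#   docker run -d -p 11434:11434 --name ollama ollama/ollama
# Reading the base URL from an environment variable (hypothetical name) is what
# would let the plugin point at any locally hosted server, not just the default.
OLLAMA_BASE_URL = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")

def generate(model: str, prompt: str) -> str:
    """Send a single non-streaming completion request to a local Ollama server."""
    response = requests.post(
        f"{OLLAMA_BASE_URL}/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=60,
    )
    response.raise_for_status()
    # With streaming disabled, Ollama returns one JSON object whose
    # "response" field holds the full generated text.
    return response.json()["response"]

if __name__ == "__main__":
    # "llama3" is only an example; any model pulled into the container works.
    print(generate("llama3", "Say hello in one sentence."))
```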
Let me know what you think. Thanks!