Would it be possible to use a locally run LLM instead of Codeium, e.g. via Ollama? #784
gitwittidbit started this conversation in General
- Interesting! The Codeium integration is currently achieved as follows:
  - Monaco editor: Monaco Codeium Provider, a fork of the official codeium-react-code-editor. It allows you to configure the language-server URL.
  - CodeMirror: codemirror-codeium, which does not allow configuring the URL out of the box, but this can easily be added.

  So to use another backend server (including a local one), you would have to provide the same API and point the URL at it. Thank you.
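To illustrate the idea of swapping the backend URL, here is a minimal sketch. All names below (`CompletionBackendConfig`, `resolveBackendUrl`, the option name `localServerUrl`, and the default server URL) are hypothetical, not the actual Monaco Codeium Provider or codemirror-codeium API; any local server you point at must expose the same completion API the hosted one does.

```typescript
// Hypothetical configuration shape; the real provider exposes a
// language-server URL option, but the exact name may differ.
interface CompletionBackendConfig {
  // e.g. a locally running, API-compatible server
  localServerUrl?: string;
}

// Assumed placeholder for the hosted default; not a confirmed endpoint.
const HOSTED_SERVER_URL = "https://server.codeium.com";

// Resolve which backend the editor should talk to: prefer a configured
// local server, otherwise fall back to the hosted default.
function resolveBackendUrl(config: CompletionBackendConfig): string {
  return config.localServerUrl ?? HOSTED_SERVER_URL;
}

// Example: point completions at a local server instead of the hosted one.
const url = resolveBackendUrl({ localServerUrl: "http://localhost:11434" });
console.log(url); // → http://localhost:11434
```

The key constraint is the one stated above: the URL is just a switch, and the local server behind it has to speak the same protocol the editor integration expects.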
-
I would rather not have my code uploaded to any 3rd party. And I am already running various LLMs locally.