Supporting Adapters in CodeCompanion #800
Replies: 2 comments
I'm getting a number of requests to add support for additional adapters in the plugin, and I should have been more explicit in the original post. If an adapter can leverage the OpenAI adapter, then I won't accept a PR for it. Instead, I'd ask the community to share it in the discussions, and we can then link to it from the documentation.
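For anyone unsure what "leveraging the OpenAI adapter" looks like, here is a minimal sketch based on the adapter guide. The endpoint URL, environment variable, and model name are placeholders for a hypothetical OpenAI-compatible service, and the exact `extend` options may differ between plugin versions:

```lua
require("codecompanion").setup({
  adapters = {
    -- Hypothetical third-party LLM with an OpenAI-compatible API;
    -- the URL, env var, and model below are placeholders
    my_llm = function()
      return require("codecompanion.adapters").extend("openai", {
        name = "my_llm",
        url = "https://api.example.com/v1/chat/completions",
        env = {
          api_key = "MY_LLM_API_KEY",
        },
        schema = {
          model = {
            default = "my-llm-large",
          },
        },
      })
    end,
  },
})
```

A snippet like this can live entirely in your own config, which is exactly why such adapters don't need to be merged into the plugin itself.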
I've decided to stop supporting the …
I wanted to share my thoughts regarding supporting additional adapters in CodeCompanion.
As the plugin continues to grow in popularity, so do the requests to add support for additional LLMs and adapters. This is a good thing and to be expected; after all, every week brings new entrants to the LLM world that grab the headlines.
However, it's impossible to support every LLM in the plugin. Doing so would add maintenance overhead with little gain for the wider user base. Anticipating this more than 10 months ago, I introduced a flexible adapter system and wrote a guide on creating your own adapters.
My stance
As of today, I'll be happy to accept PRs for any LLMs that sit on the first page of the ProLLM Leaderboard.
However, this assumes:
- the LLM can't leverage the `ollama` adapter and is hosted by a third party (see the sketch below)
- the LLM can't leverage the `openai` adapter

There will no doubt be exceptions to this rule, such as with Copilot. In those cases, let's open up a discussion.
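To illustrate the first assumption: a model served locally can usually be wired up by extending the `ollama` adapter in your own config. A minimal sketch, assuming the `extend` helper from the adapter guide (the model name is a placeholder):

```lua
require("codecompanion").setup({
  adapters = {
    -- Placeholder model name; any model pulled into your local
    -- Ollama instance works the same way
    llama3 = function()
      return require("codecompanion.adapters").extend("ollama", {
        name = "llama3",
        schema = {
          model = {
            default = "llama3:latest",
          },
        },
      })
    end,
  },
})
```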
Finally, if you know of a better resource than the ProLLM Leaderboard, please let me know.