-
Tried a few other models ... here are the results: gemini-2.5-pro worked. Using claude-sonnet-4 directly with the anthropic adapter, for example, also works just fine. So I guess there's something wrong with how codecompanion handles the LLM answers via copilot?
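In case it helps anyone reproduce the comparison, switching to the anthropic adapter looked roughly like this in my config. This is a sketch from memory against the CodeCompanion README: the `strategies`/`adapters` keys and the model id (`claude-sonnet-4` here, copied from copilot's naming) may need adjusting for your version.

```lua
-- Sketch only: route CodeCompanion's chat strategy through the anthropic
-- adapter instead of copilot, so the same model can be compared directly.
-- Key names follow the CodeCompanion README as I remember it; the exact
-- model id the Anthropic API expects may differ from copilot's
-- "claude-sonnet-4" naming, so treat it as a placeholder.
require("codecompanion").setup({
  strategies = {
    chat = { adapter = "anthropic" },
    inline = { adapter = "anthropic" },
  },
  adapters = {
    anthropic = function()
      return require("codecompanion.adapters").extend("anthropic", {
        schema = {
          model = {
            default = "claude-sonnet-4", -- placeholder model id
          },
        },
      })
    end,
  },
})
```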
-
My apologies if this has been discussed before, but my "search fu" may not be strong enough to find something relevant.
As stated in the title, sometimes codecompanion does not return the full answer from the LLM and cuts it off somewhere halfway through.
This used to happen often with certain models via copilot. Then it stopped happening. Now it seems to be back again.
This time it happened using copilot with claude-sonnet-4 ... the logs don't actually provide anything that seems useful for understanding what's going on ...
I've also seen this happen with the anthropic adapter before ...
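In case it's relevant, this is roughly how the log level can be raised to capture more detail when the truncation happens. A minimal sketch, assuming the `opts.log_level` option from the plugin docs; the key name and accepted values may differ by version.

```lua
-- Sketch only: raise the log verbosity so more of the adapter/stream
-- activity gets written to CodeCompanion's log when an answer is cut off.
-- The `opts.log_level` key and the "TRACE"/"DEBUG" values are assumed
-- from the plugin docs; check against your installed version.
require("codecompanion").setup({
  opts = {
    log_level = "TRACE", -- or "DEBUG"; the default is much quieter
  },
})
```

Checking the log right after a truncated reply should at least hint at whether the stream from copilot ended early or whether the plugin stopped rendering it.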