Issue with LLM Configuration Not Reflecting in Agent Orchestrator #15909
sowjanyagunupuru asked this question in Q&A · Unanswered

Hi,
I’m encountering an issue while configuring the LLM for the agent orchestrator.
After selecting the desired LLM in the AI Configuration View, the change is not reflected in the settings.json file (I have checked both relevant settings.json files). As a workaround, I tried manually setting the language model identifier in the file.
Despite this, the orchestrator agent in the Chat View still prompts me for the Anthropic API key. Even when the change is reflected in one of the settings.json files, the prompt persists.
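For reference, the manual entry I added to settings.json looked roughly like the sketch below. The key names and values here are only my best guess from what the AI Configuration View displays; the `ai-features.agentSettings` key, the "Orchestrator" agent name, the `purpose` label, and the `openai/gpt-4o` identifier are placeholders, not a verified schema:

```jsonc
// settings.json — hypothetical sketch; key names and values are guesses, not the verified schema
{
  "ai-features.agentSettings": {
    // "Orchestrator" = the agent I am trying to reconfigure (name as shown in the AI Configuration View)
    "Orchestrator": {
      "languageModelRequirements": [
        {
          // placeholder purpose label
          "purpose": "agent-selection",
          // placeholder model identifier; the point is that no identifier I set here takes effect
          "identifier": "openai/gpt-4o"
        }
      ]
    }
  }
}
```

Even with an entry like this present in the file, the Chat View keeps asking for the Anthropic API key, which makes me think the orchestrator is not picking up the configured model at all.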
This behavior is clearly illustrated in the attached image.
Any suggestions for resolving this would be greatly appreciated, as it is currently blocking me from proceeding further.
Thank you!

Replies: 1 comment

Hi, which version of Theia are you using? I tried to reproduce this, but failed (I used 1.63).