Specifying the LLM model in the agent configuration #5309
Hi @evancoe! I'm Dosu and I'm helping the Roo-Code team. Roo Code does not currently support specifying the LLM model as a dedicated field in the agent configuration JSON.

Instead, model selection is managed through the Roo Code settings UI: you pick the provider (Anthropic, Ollama, LM Studio, LiteLLM, or the VS Code LM API) and then select or enter the model you want to use for that provider. This happens at the profile or provider level, not per-agent in the config file. You can set up multiple API profiles, each with its own model, and switch between them as needed, either in the settings or during a chat session.

This is the officially supported and most reliable method right now; nothing in the docs indicates that specifying the model in the agent config is supported or recommended at this time. Some integrations (like the VS Code LM API) are still experimental and depend on external extensions, so reliability can vary by provider and setup. For best results, use API profiles to manage different model configurations and experiment to find what works best for your use case.
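To illustrate the idea only (this is not Roo Code's actual schema; profiles live in the settings UI, and every name below is hypothetical), the profile-level model selection described above looks conceptually like this:

```typescript
// Hypothetical sketch: Roo Code manages API profiles through its settings
// UI, not through a file or API shaped like this. The structure below only
// illustrates the concept of binding one provider+model pair per profile.
interface ApiProfile {
  name: string; // the profile label you switch between
  provider: "anthropic" | "ollama" | "lmstudio" | "litellm" | "vscode-lm";
  model: string; // model id chosen for that provider (placeholder values)
}

const profiles: ApiProfile[] = [
  { name: "default", provider: "anthropic", model: "example-cloud-model" },
  { name: "local", provider: "ollama", model: "example-local-model" },
];

// Switching profiles swaps the whole provider+model pair at once, which is
// why the model is set per profile rather than per agent.
function selectProfile(name: string): ApiProfile | undefined {
  return profiles.find((p) => p.name === name);
}

console.log(selectProfile("local")?.model); // -> "example-local-model"
```

The design point the sketch captures: because the model is an attribute of the profile, changing models is a matter of switching profiles rather than editing each agent's configuration.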