Replies: 1 comment
Any suggestions on this? @ekzhu
I have a group chat consisting of 3 agents, with OpenRouter LLMs behind two of them and OpenAI behind the third.
I have often observed that, at random, one of the agents (almost always one powered by an OpenRouter LLM) throws this error in the middle of an ongoing group chat:
Any suggestions on how to handle this? I have tried adding a fallback in on_messages, but I suspect the error occurs even before the call reaches on_messages. I am not sure how to handle this permanently, especially given my dependency on OpenRouter LLMs.
Just FYI, I am also setting structured_output and json_output to False in the LLM config, but I still get the error.
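One generic pattern (not AutoGen-specific; the function and client names below are hypothetical stand-ins, and the stub clients only simulate a flaky provider) is to wrap the model call in a retry-with-backoff layer that falls back to a second provider, so the error is absorbed before it reaches the agent loop:

```python
import asyncio


class TransientModelError(Exception):
    """Stand-in for the provider error raised mid-chat (hypothetical)."""


async def call_with_fallback(primary, fallback, prompt,
                             retries=3, base_delay=0.01):
    """Try the primary model client with exponential backoff;
    if all retries fail, route the request to the fallback client."""
    for attempt in range(retries):
        try:
            return await primary(prompt)
        except TransientModelError:
            await asyncio.sleep(base_delay * 2 ** attempt)
    return await fallback(prompt)


# Demo with stub clients: the primary always fails, the fallback answers.
async def flaky_primary(prompt):
    raise TransientModelError("simulated provider failure")


async def stable_fallback(prompt):
    return f"fallback-answer:{prompt}"


result = asyncio.run(call_with_fallback(flaky_primary, stable_fallback, "hi"))
print(result)  # fallback-answer:hi
```

In practice you would put this wrapper around whatever coroutine actually calls the OpenRouter-backed client, so retries and the provider switch happen below the layer where on_messages runs.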
Autogen version: