Does it work with openrouter.ai? #104
Unanswered · almagest21 asked this question in Q&A · Replies: 1 comment, 3 replies
-
I've tested Azure OpenAI gpt-4o-mini and it works fine in streaming mode (using the config you provided). I don't get choppy responses. It is probably worth contacting OpenRouter to address the issue.
-
Thanks for this brilliant project.
It was very difficult to determine where the problem was coming from, but I would like to report it.
I ran into this problem when using OpenRouter.ai's OpenAI-compatible endpoint (https://openrouter.ai/api/v1/chat/completions).
As shown below, the streamed response arrives choppy, with missing characters.
If the streaming option is turned off, the output looks normal.
(This is the [Enable streaming for real-time response generation] option shown when adding Custom Models.)
Could you please look into this problem?
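For context, OpenAI-compatible endpoints such as OpenRouter's stream responses as server-sent events, where each `data:` chunk carries a small text delta that the client must append in order; a dropped or mis-parsed chunk shows up exactly as missing characters. Below is a minimal sketch of that accumulation logic (the function name and the sample chunks are illustrative, not taken from this project's code):

```python
import json

def accumulate_sse(lines):
    """Accumulate text deltas from an OpenAI-style SSE stream.

    `lines` is an iterable of raw SSE lines like 'data: {...}'.
    Each delta must be appended in order; skipping a chunk produces
    the missing-character symptom described above.
    """
    parts = []
    for line in lines:
        if not line.startswith("data: "):
            continue  # ignore blank/keep-alive lines
        payload = line[len("data: "):]
        if payload.strip() == "[DONE]":
            break  # end-of-stream sentinel
        delta = json.loads(payload)["choices"][0]["delta"]
        parts.append(delta.get("content") or "")  # some chunks carry only a role
    return "".join(parts)

# Synthetic chunks in the wire format OpenAI-compatible endpoints use:
sample = [
    'data: {"choices":[{"delta":{"role":"assistant"}}]}',
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
]
print(accumulate_sse(sample))  # -> Hello
```

If a provider's stream interleaves keep-alive comments or splits JSON across reads differently than expected, naive parsing can silently drop deltas, which may be where the OpenRouter-specific choppiness comes from.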