Replies: 6 comments 3 replies
-
We support a lot of different providers: OpenAI-compatible providers (like vLLM) are supported, among others, via litellm.
-
@yamijuan Did you figure out how to use OpenRouter?
-
If OpenRouter (which I have not used) provides an OpenAI-compatible endpoint, then it is fully supported with pr-agent, with a config something like:
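A minimal sketch of such a config, assuming pr-agent reads litellm-style settings from its TOML configuration and secrets files (the section and key names here are assumptions; check the pr-agent docs for the exact layout):

```toml
# configuration.toml (sketch): point pr-agent at any OpenAI-compatible server
[config]
model = "openai/my-served-model"  # litellm treats an "openai/" prefix as OpenAI-compatible

# .secrets.toml (sketch)
[openai]
key = "sk-..."                         # the server's API key, if it requires one
api_base = "http://localhost:8000/v1"  # e.g. a local vLLM endpoint
```

The important part is overriding the API base URL so litellm sends OpenAI-style requests to your own endpoint instead of api.openai.com.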
-
I've struggled with this a lot. I reached the point where requests are routed to OpenRouter, but litellm fails to send the auth credentials.
-
Does anyone know what the config looks like if we use OpenRouter?
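Not confirmed against the pr-agent docs, but litellm itself routes OpenRouter models via an `openrouter/` prefix and an OpenRouter API key, so a hedged sketch of the config might look like this (section name is my assumption):

```toml
# configuration.toml (sketch)
[config]
model = "openrouter/anthropic/claude-3.5-sonnet"  # litellm's openrouter/ model prefix

# .secrets.toml (sketch)
[openrouter]
key = "sk-or-..."  # OpenRouter API key; litellm also reads the OPENROUTER_API_KEY env var
```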
-
After some struggle, I found the root cause and created this PR: #1744. It should solve the auth error.
-
Is there any way or plan to add OpenRouter as a model provider, or to set up a custom model provider that's OpenAI-compatible? Thanks.