Replies: 1 comment
- My bad, it works. It was something upstream I was messing up.
- I hosted a model with `vllm.entrypoints.openai.api_server`. I tried to query it with the AsyncOpenAI() client instead of OpenAI(), but I got an error that base_url is not a supported parameter. How do I run a large batch of requests asynchronously using vLLM?