It's not supported yet and is being tracked by #7528.
In vLLM v0.5.5, does --num-scheduler-steps not work with --enable-lora?
I'm serving LoRA adapters via --enable-lora, but when I set --num-scheduler-steps=8 and run inference with an adapter, the generated answer is the one from the base model, not the adapter.
In vLLM v0.5.5, can --num-scheduler-steps not be combined with --enable-lora?
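For concreteness, here is roughly how I'm launching the server and sending requests (the model name, adapter name, and adapter path are placeholders for my actual ones):

```bash
# Launch the OpenAI-compatible server with a LoRA adapter registered.
# Model name, adapter name, and adapter path are placeholders.
python -m vllm.entrypoints.openai.api_server \
    --model meta-llama/Meta-Llama-3-8B-Instruct \
    --enable-lora \
    --lora-modules my-adapter=/path/to/my/adapter \
    --num-scheduler-steps 8

# Query the adapter by passing its registered name as "model".
# With --num-scheduler-steps 8 the completion looks like base-model output;
# without that flag the adapter is applied as expected.
curl http://localhost:8000/v1/completions \
    -H "Content-Type: application/json" \
    -d '{"model": "my-adapter", "prompt": "Hello, ", "max_tokens": 32}'
```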