Quantization support #856
S-a-n-k-e-t-1998
announced in
General
Replies: 0
Does vLLM support quantization when deploying a model?
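For context, vLLM does accept pre-quantized checkpoints (e.g. AWQ or GPTQ) at load time via the `quantization` argument. A minimal sketch, assuming the offline `LLM` API and a hypothetical model id (not one named in this thread):

```python
# Minimal sketch: loading an AWQ-quantized model with vLLM's offline API.
# The model id below is illustrative only, not taken from this discussion.
from vllm import LLM, SamplingParams

llm = LLM(
    model="TheBloke/Llama-2-7B-AWQ",  # a pre-quantized AWQ checkpoint
    quantization="awq",               # quantization scheme to apply at load time
)

params = SamplingParams(temperature=0.7, max_tokens=64)
outputs = llm.generate(["What is quantization?"], params)
print(outputs[0].outputs[0].text)
```

The same flag is exposed on the server command line (e.g. `--quantization awq`), so quantized deployment works for online serving as well.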