Use vLLM with custom LLM for classification #12176
power-puff-gg announced in Q&A

Hello,

Is there any way to use vLLM with a custom LLM? It is basically a model modified to add additional classification heads, since the intention is to use it for a classification task.

Replies: 1 comment
You can register external models to vLLM according to this page.
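For anyone landing here later, below is a minimal sketch of that out-of-tree registration route. The module and class names (`my_code`, `MyModelForSequenceClassification`) and the checkpoint path are placeholders, and the `task="classify"` option plus `LLM.classify()` only exist in recent vLLM releases, so treat this as an outline rather than the exact API for every version:

```python
# Sketch: registering a custom (out-of-tree) model class with vLLM.
# `my_code.MyModelForSequenceClassification` is a placeholder for your own
# class, which must implement vLLM's model interface for the version you run.
from vllm import LLM, ModelRegistry

from my_code import MyModelForSequenceClassification  # hypothetical module

# The architecture name must match the `architectures` entry in the
# checkpoint's Hugging Face config.json so vLLM can resolve it at load time.
ModelRegistry.register_model(
    "MyModelForSequenceClassification", MyModelForSequenceClassification
)

# Once registered, the model loads like any built-in architecture.
llm = LLM(model="/path/to/my-classifier-checkpoint", task="classify")

# Recent vLLM releases expose a classification/pooling API; older releases
# only have generate()/encode(), so check the docs for your version.
outputs = llm.classify(["an example input to classify"])
print(outputs[0].outputs.probs)
```

If importing your model class initializes CUDA, recent versions also let you pass a lazy `"module:ClassName"` string to `register_model` instead of the class itself, which avoids CUDA re-initialization problems in vLLM's worker processes.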