Inferencing on Fine-Tuned Mistral 7B #2085
Manivarsh-adi announced in General
- Hello,
  I have fine-tuned Mistral 7B on a specific dataset and want to run inference on it using the vLLM module. Is that possible?
- Absolutely. vLLM currently supports Mistral models, so as long as the model is fine-tuned on the Mistral base, it can be used directly.
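  For reference, here is a minimal sketch of offline inference with vLLM, assuming the fine-tuned checkpoint was saved in Hugging Face format; the local model path and the prompt below are hypothetical placeholders.

  ```python
  from vllm import LLM, SamplingParams

  # Point vLLM at the directory containing the fine-tuned checkpoint
  # (config.json, tokenizer files, and the weight shards).
  # "./mistral-7b-finetuned" is a hypothetical local path.
  llm = LLM(model="./mistral-7b-finetuned")

  # Basic sampling settings; tune these for your use case.
  sampling_params = SamplingParams(temperature=0.7, max_tokens=256)

  # Example prompt; replace with inputs matching your fine-tuning format.
  prompts = ["Summarize the benefits of paged attention in one sentence."]
  outputs = llm.generate(prompts, sampling_params)

  for output in outputs:
      print(output.outputs[0].text)
  ```

  If the fine-tuning used a prompt template (e.g. Mistral's instruction format), apply the same template to the prompts here, since vLLM does not add one automatically in this offline API.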