Replies: 2 comments
-
Yes, please refer to "How can I convert my own model for use with Tabby?" in https://tabby.tabbyml.com/docs/faq/
You could either convert it yourself, or check whether https://huggingface.co/TheBloke has already converted it for you, for example https://huggingface.co/TheBloke/CodeLlama-70B-hf-GGUF/tree/main
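For reference, the do-it-yourself route typically goes through llama.cpp's conversion script. The commands below are a sketch only: the model directory, output filenames, and the Q8_0 quantization type are illustrative assumptions, and the script/binary names have changed across llama.cpp versions, so check the version you clone.

```shell
# Sketch: convert a locally downloaded Hugging Face checkpoint to GGUF
# using llama.cpp. Paths and model names here are placeholders.
git clone https://github.com/ggerganov/llama.cpp
pip install -r llama.cpp/requirements.txt

# Convert the checkpoint directory into a single GGUF file.
# (In older llama.cpp versions this script is named convert-hf-to-gguf.py.)
python llama.cpp/convert_hf_to_gguf.py ./my-model-dir \
  --outfile my-model.gguf

# Optionally quantize to reduce the file size; Q8_0 is just one example.
# (Build llama.cpp first; the binary may be named "quantize" in older versions.)
./llama.cpp/llama-quantize my-model.gguf my-model.Q8_0.gguf Q8_0
```

The resulting `.gguf` file can then be registered with Tabby as described in the FAQ entry linked above.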
-
Thank you very much.
-
I have read issue #1230.
But what should I do if Hugging Face does not have the model in GGUF format?
(I want to try models better than TabbyML/DeepseekCoder-6.7B, since DeepseekCoder-6.7B still does not push my device to its limit.)