If the architecture is the same, you can try adding it to the annotation.
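For context on what "adding it to the annotation" means: converters like `convert_hf_to_gguf.py` map the architecture strings from a model's config to converter classes via a class decorator. Below is a minimal, self-contained sketch of that registry pattern; all names here (`register`, `lookup`, `XLMRobertaConverter`) are illustrative, not the actual llama.cpp API.

```python
# Sketch of a decorator-based architecture registry, as used by
# HF -> GGUF converters. Names are hypothetical, not llama.cpp's API.

_model_registry: dict[str, type] = {}

def register(*arch_names: str):
    """Map one or more HF `architectures` strings to a converter class."""
    def wrap(cls: type) -> type:
        for name in arch_names:
            _model_registry[name] = cls
        return cls
    return wrap

# If RoBERTa really shares the XLM-RoBERTa architecture, adding its
# architecture string to the same annotation reuses the same converter.
@register("XLMRobertaModel", "RobertaModel")
class XLMRobertaConverter:
    pass

def lookup(arch: str) -> type:
    """Find the converter class for an architecture string."""
    try:
        return _model_registry[arch]
    except KeyError:
        raise ValueError(f"Model {arch} is not supported")
```

With this pattern, a checkpoint whose config reports `"RobertaModel"` would dispatch to the same converter class as XLM-RoBERTa, which is presumably what the suggestion amounts to.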
I am trying to generate a GGUF-formatted RoBERTa-style model. However, both the base RoBERTa model and a masked-language-model variant (`*-mlm`) raise errors when running the conversion from `main` on RoBERTa.

What I've tried:
(1) The HF -> GGUF converter browser option for your repo, which raises errors (see screenshots).
(2) The convert Python module from the `main` branch, after downloading a RoBERTa model from Hugging Face to my local machine. When I try to do the conversion to GGUF, this raises the same error as the converter in (1).
I'd like to be able to convert a RoBERTa model into GGUF format so that it can be used in the `lembed` SQLite extension, which requires the model to be available locally in GGUF format.

Note: If there are other ways to enable the conversion, I'm happy to try them; please let me know the preferred conversion method if my current approach isn't the right one.
From spelunking the transformers code, I see that the architecture is ostensibly the same for XLMRoberta as for Roberta (ref: https://github.com/huggingface/transformers/blob/bdb29ff9f3b8030772bd4be037d061f253c0e928/src/transformers/models/xlm_roberta/modeling_xlm_roberta.py#L152). So, given that the XLM-RoBERTa model is supported, I'd like to try using the same conversion for a RoBERTa model. Is there a way to achieve conversion for a RoBERTa model using the XLM-RoBERTa code?
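One way to see where the mismatch happens: the converter generally picks its conversion class from the `architectures` field of the checkpoint's `config.json`, so a RoBERTa checkpoint reporting `"RobertaModel"` or `"RobertaForMaskedLM"` won't match a converter registered only for the XLM-RoBERTa string. A small sketch for inspecting what your local checkpoint reports (the field handling is an assumption about a typical HF config, not the converter's exact code):

```python
import json

def reported_architectures(config_path: str) -> list[str]:
    """Return the `architectures` list from a Hugging Face config.json.

    Converters typically dispatch on this string, so comparing it
    against the supported-architecture list explains "not supported"
    errors. Returns [] if the field is absent (an assumption about
    how to handle that case).
    """
    with open(config_path, encoding="utf-8") as f:
        config = json.load(f)
    return config.get("architectures", [])
```

For example, calling this on the `config.json` of the downloaded RoBERTa checkpoint shows the exact string the converter would need to have registered.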
Some related items:
- PR 5423 seems to have added BERT model support.
- Discussion 7712 shows how to generate GGUF for other, supported model types.
- Discussion 2948 is a possibly outdated tutorial from 2023 on conversion to GGUF format.
- PR 8205 adds some documentation for BERT support.