unable to convert a model myself #80
Unanswered
maddes8cht asked this question in Q&A
-
I am currently trying to convert a Falcon model myself.
I am starting with the original falcon-7b-instruct and falcon-40b-instruct to be sure that any conversion problems are not caused by incompatibilities in a derived model.
I have successfully downloaded the model directories with
git clone https://huggingface.co/tiiuae/falcon-40b-instruct
and
git clone https://huggingface.co/tiiuae/falcon-7b-instruct
respectively, and successfully performed the conversion with
python falcon_convert.py E:/Conversion/falcon-raw/falcon-40b-instruct E:/Conversion/falcon-out use-f32
on my Windows system. But the quantization fails with this message:
It also does not work if I copy tokenizer.json into the directory of the f32.bin file. Apparently the message says this file should be in a subdirectory with the same name as the f32.bin, but Windows won't allow me to create a directory with the same name as a file in the same parent directory.
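For illustration only, here is a minimal sketch of the kind of path lookup that could produce such a message; the function name and path conventions are assumptions, not taken from falcon_convert.py or the quantizer:

from pathlib import Path

def find_tokenizer(model_file: str) -> Path:
    # Hypothetical lookup: derive the tokenizer location from the model
    # file name. If a loader reuses the *full* file name as a directory
    # ("model-f32.bin/tokenizer.json"), that directory can never exist,
    # because no filesystem allows a file and a directory with the
    # identical name in the same parent.
    model = Path(model_file)
    impossible = model.parent / model.name / "tokenizer.json"
    # A workable convention instead: tokenizer.json beside the model file.
    sibling = model.parent / "tokenizer.json"
    for candidate in (impossible, sibling):
        if candidate.exists():
            return candidate
    raise FileNotFoundError(f"no tokenizer.json found near {model}")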
Why do I even get the "old file format" message when I use a freshly converted file?
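As for the "old file format" question: ggml-based loaders typically decide this from the magic number in the first four bytes of the file, so inspecting those bytes shows what the converter actually wrote. A minimal sketch, assuming illustrative magic constants; which exact values a given quantizer accepts or rejects is an assumption here, not read out of this repository:

import struct

# Illustrative ggml-style magics; which of these a quantizer treats
# as "old" is an assumption for this sketch.
KNOWN_MAGICS = {
    0x67676D6C: "ggml (unversioned; often rejected as an old format)",
    0x67676A74: "ggjt (versioned)",
}

def inspect_magic(path: str) -> str:
    with open(path, "rb") as f:
        (magic,) = struct.unpack("<I", f.read(4))  # little-endian uint32
    return KNOWN_MAGICS.get(magic, f"unknown magic 0x{magic:08x}")

print(inspect_magic("E:/Conversion/falcon-out/model-f32.bin"))  # path assumed

If the freshly converted file carries a magic the quantizer classifies as old, the problem lies in the conversion step (or the wrong file is being passed), not in the tokenizer layout.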
Replies: 1 comment
-
Hi, Regarding the directory: It expects each model to be loaded from a directory […]
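Presumably the expected layout is one directory per model, with the converted weights and the tokenizer files side by side; a sketch, with file names that are assumptions rather than prescribed by the repository:

E:/Conversion/falcon-out/falcon-40b-instruct/
    model-f32.bin       <- output of falcon_convert.py (name assumed)
    tokenizer.json      <- copied from the Hugging Face checkout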