I have a couple of adventure and novel-writing models, and when I try to convert them using convert_hf_to_gguf.py I get one of the following errors:
ERROR:hf-to-gguf:Model XGLMForCausalLM is not supported
ERROR:hf-to-gguf:Model GPTJForCausalLM is not supported
ERROR:hf-to-gguf:Model OPTForCausalLM is not supported
ERROR:hf-to-gguf:Model GPTJForCausalLM is not supported
I am pretty sure they should be convertible. If it is of any relevance, they are from KoboldAI, and I'd like to convert them to GGUF so I can import them into Ollama and use them with a better UI.
Any help / advice is appreciated.