I've run into a model-loading problem. I'm using llama-cli on AWS Ubuntu 22.04 and want to load a Llama model; my models are stored in the models folder. Whether I use llama 3.1 8b 2q_k.gguf or llama 2 7b 2q_k.gguf, the model fails to load and the error below is displayed. Is my command wrong, or is there a problem with the models? Thanks
Answered by ngxson, Aug 15, 2024
Your model file is corrupted (i.e. the download link is incorrect), so you must re-download it. Also try `ls -la`; you will see your current model is only 1 KB.
Answer selected by zheng48
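A quick way to confirm this diagnosis is to check the file size and the leading bytes: a valid GGUF model starts with the 4-byte magic `GGUF`, whereas a failed download is often a tiny HTML error page saved under the `.gguf` name. This is a minimal sketch; the paths and the `check_gguf` helper are hypothetical, so adjust them to your own model file.

```shell
# Sketch: detect a corrupted GGUF download (hypothetical paths).
# A real GGUF model begins with the magic bytes "GGUF" and is
# typically gigabytes in size, not ~1 KB.

check_gguf() {
  f="$1"
  size=$(wc -c < "$f" | tr -d ' ')   # file size in bytes
  magic=$(head -c 4 "$f")            # first 4 bytes of the file
  if [ "$magic" = "GGUF" ] && [ "$size" -gt 1000000 ]; then
    echo "$f: looks like a valid GGUF model ($size bytes)"
  else
    echo "$f: corrupted or incomplete download ($size bytes) -- re-download it"
  fi
}

# Simulate a broken download (an HTML error page saved as .gguf):
printf '<html>404 Not Found</html>' > /tmp/broken.gguf
check_gguf /tmp/broken.gguf
```

If the check fails, delete the file and re-download it, making sure the download link points at the actual `.gguf` binary rather than an HTML page.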