Hello, I would like to request your help. I'm trying to finetune the following model: TheBloke/deepseek-coder-1.3b-base-GGUF (deepseek-coder-1.3b-base.Q8_0.gguf). I ran the finetuning using the following command:
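(The exact command is not preserved in this copy of the thread; as a rough reference, a finetuning run with the llama.cpp `finetune` example against a Q8_0 GGUF base typically looks like the sketch below. The training file, checkpoint names, and hyperparameters are placeholders, not the values actually used here.)

```sh
# Illustrative llama.cpp finetune invocation (placeholder file names and settings)
./finetune \
  --model-base deepseek-coder-1.3b-base.Q8_0.gguf \
  --checkpoint-in  chk-lora-deepseek-coder-1.3b-LATEST.gguf \
  --checkpoint-out chk-lora-deepseek-coder-1.3b-ITERATION.gguf \
  --lora-out lora-deepseek-coder-1.3b-ITERATION.bin \
  --train-data "train.txt" \
  --save-every 10 \
  --threads 6 --adam-iter 30 --batch 4 --ctx 64 \
  --use-checkpointing
```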
After the finetuning, I ran the model with the LoRA adapter using the following command:
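(Again, the exact command is not preserved; running the base model together with the resulting adapter via the llama.cpp `main` binary usually takes this form, with placeholder names:)

```sh
# Illustrative inference run applying the trained LoRA adapter
./main -m deepseek-coder-1.3b-base.Q8_0.gguf \
  --lora lora-deepseek-coder-1.3b-LATEST.bin \
  -p "def quicksort(arr):" -n 128
```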
Answered by slaren (Dec 21, 2023):
I don't know if finetuning deepseek works, but this error is already fixed on master.
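(If you hit the same error, the usual way to pick up the fix is to update your llama.cpp checkout to the current master and rebuild; assuming the standard make-based build:)

```sh
# Update an existing llama.cpp checkout and rebuild the tools (main, finetune, ...)
git pull origin master
make clean && make
```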
Answer selected by m1chae1bx.