The model generated by P-Tuning v2 fine-tuning has an unexpected size, seeking advice #1147
-
The checkpoints-500 generated after P-Tuning v2 fine-tuning is 16 GB. Why is that? With LoRA fine-tuning the whole thing is only 141 MB.
Answered by zRzRzRzRzRzRzR on Apr 20, 2024
Replies: 2 comments
-
I noticed that after fine-tuning, P-Tuning v2 saves the merged model, while LoRA saves only the parameters before merging? Why is that?
-
P-Tuning saves the complete model; it uses the full-parameter saving approach.
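To see where the size difference comes from, a rough estimate helps: a model with roughly 6 billion parameters stored in fp16/bf16 already takes about 6e9 × 2 bytes ≈ 12 GB on disk, and whatever else the trainer writes into the checkpoint folder pushes it toward 16 GB, whereas a LoRA checkpoint holds only the small low-rank adapter matrices. Below is a minimal sketch that just measures the two folders; the 6B figure and the `output/checkpoint-500` / `output-lora/checkpoint-500` paths are assumptions, not taken from this thread.

```python
# Minimal sketch: compare the on-disk size of the two checkpoint folders.
# The two paths are placeholders; point them at your own output directories.
import os

def dir_size_gb(path: str) -> float:
    """Total size of all files under `path`, in GiB."""
    total = sum(
        os.path.getsize(os.path.join(root, name))
        for root, _, names in os.walk(path)
        for name in names
    )
    return total / 1024 ** 3

# P-Tuning v2 run: the full set of model weights is written out, so a
# ~6B-parameter model in fp16/bf16 already accounts for ~12 GB of weights alone.
print("p-tuning v2 checkpoint:", round(dir_size_gb("output/checkpoint-500"), 1), "GB")

# LoRA run: only the adapter files (adapter_model.safetensors + adapter_config.json)
# are written, i.e. just the low-rank matrices, which is why it stays around 141 MB.
print("lora checkpoint:", round(dir_size_gb("output-lora/checkpoint-500"), 1), "GB")
```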
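This also answers the merged-vs-unmerged question above: the P-Tuning v2 checkpoint folder is a self-contained model, while the LoRA folder is only an adapter that has to be attached to (and optionally merged into) the frozen base model. A minimal loading sketch, assuming the Hugging Face transformers + peft stack and a ChatGLM3-6B base model (the paths and the `THUDM/chatglm3-6b` name are assumptions; adjust to your setup):

```python
# Minimal loading sketch (assumed setup: transformers + peft, ChatGLM3-6B base).
from transformers import AutoModel
from peft import PeftModel

# P-Tuning v2 checkpoint: a self-contained model directory, loadable on its own.
pt2_model = AutoModel.from_pretrained("output/checkpoint-500", trust_remote_code=True)

# LoRA checkpoint: adapter weights only, so load the frozen base model first and
# attach the adapter on top of it.
base = AutoModel.from_pretrained("THUDM/chatglm3-6b", trust_remote_code=True)
lora_model = PeftModel.from_pretrained(base, "output-lora/checkpoint-500")

# Optional: fold the adapter into the base weights to get a single merged model.
merged = lora_model.merge_and_unload()
```

Note that saving `merged` with `save_pretrained` would again write a full-size model folder, which is exactly the size jump this question is about.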
Answer selected by zRzRzRzRzRzRzR