RuntimeError: Error(s) in loading state_dict for PeftModelForCausalLM:
size mismatch for base_model.model.model.embed_tokens.modules_to_save.default.weight: copying a param with shape torch.Size([128320, 4096]) from checkpoint, the shape in current model is torch.Size([128256, 4096]).
size mismatch for base_model.model.lm_head.modules_to_save.default.weight: copying a param with shape torch.Size([128320, 4096]) from checkpoint, the shape in current model is torch.Size([128256, 4096]).
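The two shapes differ by exactly 64 rows (128320 − 128256), i.e. the tokens added during fine-tuning: the checkpoint's embedding and lm_head were resized to the enlarged vocabulary, but the freshly loaded base model's were not. A minimal numpy sketch of the mismatch, using the sizes from the traceback above (the resize step mimics what `resize_token_embeddings` effectively does; it is an illustration, not the library code):

```python
import numpy as np

BASE_VOCAB = 128256  # vocab size of the freshly loaded base model
CKPT_VOCAB = 128320  # vocab size saved in the fine-tuned checkpoint
HIDDEN = 4096        # hidden size of the model

# Embedding matrix as loaded from the base model.
base_embed = np.zeros((BASE_VOCAB, HIDDEN), dtype=np.float32)

# Embedding matrix stored in the adapter checkpoint (modules_to_save).
ckpt_embed = np.zeros((CKPT_VOCAB, HIDDEN), dtype=np.float32)

# The size mismatch is exactly the number of added tokens.
added_tokens = ckpt_embed.shape[0] - base_embed.shape[0]
print(added_tokens)  # 64

# Growing the base matrix first (as resize_token_embeddings would)
# makes the checkpoint weights fit.
resized = np.zeros((BASE_VOCAB + added_tokens, HIDDEN), dtype=np.float32)
resized[:BASE_VOCAB] = base_embed
assert resized.shape == ckpt_embed.shape
```

This is why the adapter only loads after the base model's token embeddings are resized to the same vocabulary size the checkpoint was trained with.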
hiyouga added the solved label ("This problem has been already solved") and removed the bug ("Something isn't working") and pending ("This problem is yet to be addressed") labels on May 9, 2025.
Reminder
System Info

llamafactory version: 0.9.3.dev0

Reproduction
Problem description

During LoRA SFT training, if the add_tokens parameter is used, adapter_config.json ends up with "modules_to_save": ["lm_head","embed_tokens"], and inference then fails with: ValueError: vLLM only supports modules_to_save being None.

Without the add_tokens parameter everything works fine. In earlier versions, using special_token also worked fine.

Related issue: #4139

The inference config file is attached below.
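A common workaround (not confirmed by the maintainers in this thread) is to merge the LoRA adapter, including the saved embed_tokens and lm_head, into the base model before serving, so the exported model carries no modules_to_save for vLLM to reject. A sketch of an export config for `llamafactory-cli export` — all paths are placeholders, and the template should match the one used for training:

```yaml
### model
model_name_or_path: /path/to/base_model        # placeholder path
adapter_name_or_path: /path/to/lora_adapter    # placeholder path
template: llama3
finetuning_type: lora

### export
export_dir: /path/to/merged_model              # placeholder path
export_legacy_format: false
```

Running `llamafactory-cli export merge_config.yaml` and pointing vLLM at export_dir should avoid the error, since the merged model already contains the resized embeddings and needs no adapter at inference time.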
Others
No response