Model embedding layer issue
#434
This issue requires modifying the Hugging Face tokenizer code; there is more related discussion in #436. Please contribute this kind of issue in that discussion area. Thanks for the support.
System Info
```python
def set_input_embeddings(self, value: nn.Module):
    """
    Set model's input embeddings.
    """
```

The problem is inside `PreTrainedModel.set_input_embeddings`. On the first call, `self` is `ChatGLMForConditionalGeneration`, and the method resolves `base_model` to `ChatGLMModel`. Because `ChatGLMModel` does not define its own `set_input_embeddings`, the call dispatches back into `PreTrainedModel.set_input_embeddings`; this time `base_model` resolves to `self`, so execution ends at `raise NotImplementedError`.
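As a stopgap until this is handled upstream, one could attach the missing accessors to the loaded `ChatGLMModel` before calling `resize_token_embeddings`. The sketch below is not the official fix: it assumes the inner `ChatGLMModel` is exposed as `model.transformer` and keeps its token embeddings at `embedding.word_embeddings` (verify against your local `modeling_chatglm.py`), and it only resizes the input side, not the output projection.

```python
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("THUDM/chatglm3-6b", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm3-6b", trust_remote_code=True)

def _get_input_embeddings(self) -> nn.Module:
    # Assumed attribute path inside ChatGLMModel; check modeling_chatglm.py.
    return self.embedding.word_embeddings

def _set_input_embeddings(self, value: nn.Module):
    self.embedding.word_embeddings = value

# Patch the inner ChatGLMModel class so PreTrainedModel's generic dispatch
# finds an override instead of falling through to raise NotImplementedError.
type(model.transformer).get_input_embeddings = _get_input_embeddings
type(model.transformer).set_input_embeddings = _set_input_embeddings

model.resize_token_embeddings(len(tokenizer))  # now resizes the input embeddings
```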
Reproduction
```python
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("THUDM/chatglm3-6b", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm3-6b", trust_remote_code=True)
model.resize_token_embeddings(len(tokenizer))  # crashes
```
Expected behavior
If there is no plan to support `resize_token_embeddings`, it would be better to give an explicit hint (see the sketch below).
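For illustration only, one way to provide such a hint would be an explicit override in the remote modeling code; the class below is a hypothetical stand-in, not the actual THUDM/chatglm3-6b source.

```python
import torch.nn as nn

class ChatGLMModel(nn.Module):
    # Illustrative stand-in for the class in modeling_chatglm.py.
    def set_input_embeddings(self, value: nn.Module):
        raise NotImplementedError(
            "ChatGLMModel does not support set_input_embeddings, so "
            "resize_token_embeddings is unavailable; adjust the tokenizer and "
            "embedding weights manually instead."
        )
```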