Replies: 9 comments 10 replies
-
May I ask whether you have any new ideas? I found the PrefixEncoder class in the source files; it seems to be used in P-Tuning v2.
Where it is used:
In that code path, the official implementation feeds the resulting embeddings in as past_key_values: Optional[Tuple[Tuple[torch.Tensor, torch.Tensor], ...]] = None
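For reference, here is a minimal sketch of the nested-tuple layout that past_key_values expects, which is what the PrefixEncoder output is reshaped into for prefix tuning. This is my own illustration, not the actual ChatGLM/P-Tuning v2 source; the layer count and head dimensions are placeholders, and shapes are modeled as plain tuples so the structure is easy to inspect:

```python
# Sketch of the past_key_values layout used for prefix tuning:
# one (key, value) pair per transformer layer, each entry of shape
# (batch, num_heads, prefix_len, head_dim).
def prefix_to_past_key_values(prefix_len, num_layers, num_heads, head_dim, batch=1):
    kv_shape = (batch, num_heads, prefix_len, head_dim)
    return tuple((kv_shape, kv_shape) for _ in range(num_layers))
```

Roughly speaking, the PrefixEncoder produces one long vector per prefix position, which is then split and permuted into this per-layer (key, value) structure before being handed to the model as past_key_values.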
-
Not sure; I'll discuss it with the algorithm team.
-
I'm trying to use GCG with ChatGLM3. After reading the code carefully, I think generate() actually does support inputs_embeds, which may solve the issue.
The parameter …, so in fact, to use …. I'm not sure if my understanding is correct. And I find that, when running …, …. Not sure about my understanding; thanks a lot in advance for your support!
-
The code below does pass the embeddings as an input, but calling model.generate() raises an error (its message starts with "You passed"):
inputs = tokenizer(MutilTalk_Prompt, padding='max_length', max_length=99)
tensor_input_ids = torch.tensor(inputs['input_ids'] + [2])
tensor_input_ids = tensor_input_ids.cuda()
print(tensor_input_ids)
input_embeds = model.transformer.embedding(tensor_input_ids.unsqueeze(0))
# forward() accepts inputs_embeds, so this call works
outputs = model(input_ids=tensor_input_ids.unsqueeze(0), inputs_embeds=input_embeds)
logits_output = tokenizer.batch_decode(torch.argmax(outputs['logits'], -1).detach().cpu().numpy(), skip_special_tokens=True)
print(logits_output)
# error: generate() rejects inputs_embeds here
outputs = model.generate(input_ids=tensor_input_ids.unsqueeze(0), inputs_embeds=input_embeds)
logits_output = tokenizer.batch_decode(torch.argmax(outputs['logits'], -1).detach().cpu().numpy(), skip_special_tokens=True)
print(logits_output)
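Since forward() accepts inputs_embeds while generate() refuses them, one workaround is to bypass generate() and run a manual greedy loop. Below is a toy sketch of that decoding pattern; the forward and embed callables, eos_id, and the list-based logits are stand-ins for illustration, not the ChatGLM API:

```python
# Manual greedy decoding when generate() cannot take embeddings:
# embed the current ids, run the forward pass on embeddings only,
# pick the argmax of the last position, and feed the new id back in.
def greedy_decode(forward, embed, prompt_ids, max_new_tokens, eos_id):
    ids = list(prompt_ids)
    for _ in range(max_new_tokens):
        logits = forward(embed(ids))  # forward consumes embeddings only
        next_id = max(range(len(logits[-1])), key=logits[-1].__getitem__)
        if next_id == eos_id:
            break
        ids.append(next_id)
    return ids
```

With a real model, embed would be model.transformer.embedding and forward a call to model(inputs_embeds=...); a KV cache would make the loop efficient, which this sketch omits for clarity.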
-
Oh, I get what you mean now; actually I do not use …
-
Hi all, how did you solve the NotImplementedError raised by model.resize_token_embeddings? Do you simply avoid calling it, or does the code need to be modified? I'm not very experienced with this; I'd appreciate it if someone could offer some guidance.
-
I ran into the same problem: the model does not expose a set_input_embeddings interface, so running it raises NotImplementedError. How can this be solved?
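For context, resize_token_embeddings in transformers works through the get_input_embeddings()/set_input_embeddings() accessors; if a model's code leaves them as the base-class stubs, they raise NotImplementedError. A minimal sketch of the pattern, with a toy base class standing in for PreTrainedModel (the attribute name embedding is an assumption, not the verified ChatGLM attribute):

```python
# Toy stand-in for the base class whose stub accessors raise.
class BaseModel:
    def get_input_embeddings(self):
        raise NotImplementedError

    def set_input_embeddings(self, value):
        raise NotImplementedError


# Implementing the two accessors is what unblocks resize_token_embeddings.
class PatchedModel(BaseModel):
    def __init__(self, embedding):
        self.embedding = embedding  # stands in for the model's word-embedding module

    def get_input_embeddings(self):
        return self.embedding

    def set_input_embeddings(self, value):
        self.embedding = value
```

On the real model you would point these accessors at the actual word-embedding module; check the model's own source for the correct attribute path before patching.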
-
I haven't looked at the concrete generate() code yet, so let me start from prepare_inputs_for_generation.
As the screenshot above shows, llama's prepare_inputs_for_generation supports embedding input, but chatglm's does not.
Does that mean ChatGLM's generate() does not support embedding input?
Apologies if I've misunderstood.
@xunkai55 @davidlvxin @duzx16
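For comparison, here is the gist of the llama-style prepare_inputs_for_generation dispatch the screenshot refers to, reduced to plain Python (tensors replaced by lists; a sketch of the logic, not the actual transformers source): on the first step, when no cache exists yet, inputs_embeds is forwarded; on every later step the freshly generated input_ids are used.

```python
# Sketch of llama-style input dispatch for generate():
# embeddings are only usable for the prefill step; once a KV cache
# exists, decoding continues from the newly produced token ids.
def prepare_inputs_for_generation(input_ids, past_key_values=None, inputs_embeds=None):
    if past_key_values is not None:
        # with a cache, only the not-yet-processed trailing token is needed
        input_ids = input_ids[-1:]
    if inputs_embeds is not None and past_key_values is None:
        model_inputs = {"inputs_embeds": inputs_embeds}
    else:
        model_inputs = {"input_ids": input_ids}
    model_inputs["past_key_values"] = past_key_values
    return model_inputs
```

A ChatGLM version lacking the inputs_embeds branch would always fall through to input_ids, which matches the error reported earlier in this thread.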