
Conversation

hot-zhy (Contributor) commented Sep 1, 2024

No description provided.

from mindnlp.peft.utils.integrations import dequantize_module_weight
from ...utils.other import transpose
import mindspore
from mindspore import ops
Collaborator:

Use mindnlp.core.ops for all of these.
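
A minimal sketch of these imports after applying the suggestion, assuming the same ops are available under mindnlp.core:

from mindnlp.peft.utils.integrations import dequantize_module_weight
from ...utils.other import transpose
import mindspore
from mindnlp.core import ops  # replaces `from mindspore import ops`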

from mindspore import Parameter
from mindnlp.core import ops

def dequantize_module_weight(module: nn.Module) -> nn.Parameter:
Collaborator:

This part isn't used; no need to add it.

Contributor Author:

Fixed~

tiktoken
faiss_cpu
phonemizer
bitsandbytes
Collaborator:

Remove the quantization-related ones...
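
Of these entries, bitsandbytes is the quantization library; a sketch of the requirements list with it dropped:

tiktoken
faiss_cpu
phonemizer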

lora_weight = lora_weight.half()
weight_norm = self.get_weight_norm(weight, lora_weight, scaling)
if place_on_cpu:
    weight_norm = weight_norm.to("cpu")
Collaborator:

PyTorch code?
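
For reference, a rough MindSpore-flavored sketch of the same logic (hypothetical; .half() and .to("cpu") are PyTorch tensor methods, not MindSpore ones):

lora_weight = lora_weight.astype(mindspore.float16)  # MindSpore equivalent of .half()
weight_norm = self.get_weight_norm(weight, lora_weight, scaling)
# MindSpore has no per-tensor .to("cpu"); if host placement is really
# needed, the value could be pulled back with weight_norm.asnumpy().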

self._disable_adapters = False
self.merged_adapters = []
self.use_dora: dict[str, bool] = {}
self.lora_magnitude_vector: Optional[ParameterDict] = None # for DoRA
Collaborator:

Why was this changed?
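
For context, these fields back DoRA (Weight-Decomposed Low-Rank Adaptation), which rescales each column of the merged weight by a learnable magnitude vector. A minimal sketch of the norm behind the get_weight_norm call quoted earlier (illustrative names only, not the actual implementation):

def get_weight_norm(weight, lora_weight, scaling):
    # DoRA decomposes the adapted weight as W' = m * V / ||V||_c,
    # where V = W + scaling * (B @ A) and m is lora_magnitude_vector.
    combined = weight + scaling * lora_weight
    return ops.sqrt((combined ** 2).sum(axis=1))  # column-wise L2 norm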

# initialize A to zeros and B from a normal distribution (for embeddings, the roles of A and B are swapped relative to nn.Linear)
nn.init.zeros_(self.lora_embedding_A[adapter_name])
nn.init.normal_(self.lora_embedding_B[adapter_name])
def dora_init(self, adapter_name: str) -> None:
Collaborator:

Add a blank line between functions.
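
A sketch of the fix, with a blank line separating the two methods:

        nn.init.zeros_(self.lora_embedding_A[adapter_name])
        nn.init.normal_(self.lora_embedding_B[adapter_name])

    def dora_init(self, adapter_name: str) -> None:
        ...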

@lvyufeng lvyufeng merged commit 68c2ab4 into mindspore-lab:master Sep 12, 2024
28 of 35 checks passed