Commit c5b8b59

[Misc] Fix PhiMoE expert mapping (#21085)
Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>
1 parent 4fcef49 commit c5b8b59

File tree

1 file changed: +1 −6 lines changed

vllm/model_executor/models/phimoe.py

Lines changed: 1 addition & 6 deletions
@@ -533,14 +533,9 @@ def load_weights(self, weights: Iterable[tuple[str,
             ("qkv_proj", "v_proj", "v"),
         ]
 
-        expert_params_mapping = FusedMoE.make_expert_params_mapping(
-            ckpt_gate_proj_name="w1",
-            ckpt_down_proj_name="w2",
-            ckpt_up_proj_name="w3",
-            num_experts=self.config.num_local_experts)
-
         params_dict = dict(self.named_parameters())
         loaded_params: set[str] = set()
+        expert_params_mapping = self.get_expert_mapping()
         for name, loaded_weight in weights:
             if (self.quant_config is not None and
                     (scale_name := self.quant_config.get_cache_scale(name))):
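
For context, the expert params mapping tells the weight loader how per-expert checkpoint tensors (named w1/w2/w3 in the PhiMoE checkpoint) map onto the fused FusedMoE parameters. The change replaces the inline FusedMoE.make_expert_params_mapping(...) call with the model's own get_expert_mapping() hook, so the mapping is defined in one place. Below is a minimal sketch of what such a mapping might produce, using the (param_name, weight_name, expert_id, shard_id) tuple layout; the fused parameter names (experts.w13_weight, experts.w2_weight) and the helper name are illustrative assumptions, not a verbatim copy of vLLM's implementation.

    # Illustrative sketch (assumed names, not vLLM's exact implementation):
    # expand per-expert checkpoint weight names into
    # (param_name, weight_name, expert_id, shard_id) tuples that the
    # weight loader can match incoming checkpoint keys against.
    def make_expert_params_mapping_sketch(
            ckpt_gate_proj_name: str,   # "w1" in the PhiMoE checkpoint
            ckpt_down_proj_name: str,   # "w2"
            ckpt_up_proj_name: str,     # "w3"
            num_experts: int) -> list[tuple[str, str, int, str]]:
        mapping: list[tuple[str, str, int, str]] = []
        for expert_id in range(num_experts):
            for weight_name, shard_id in [
                (ckpt_gate_proj_name, "w1"),
                (ckpt_down_proj_name, "w2"),
                (ckpt_up_proj_name, "w3"),
            ]:
                # Gate (w1) and up (w3) projections load into a fused "w13"
                # parameter; the down projection (w2) has its own parameter.
                # These fused names are assumptions for the sketch.
                param_name = ("experts.w13_weight"
                              if shard_id in ("w1", "w3")
                              else "experts.w2_weight")
                mapping.append((param_name,
                                f"experts.{expert_id}.{weight_name}.",
                                expert_id, shard_id))
        return mapping

Centralizing the mapping behind get_expert_mapping() means load_weights and any other consumer of the mapping share one definition instead of repeating the checkpoint projection names at each call site.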
