Commit ef27370

add mc2 mask

Signed-off-by: weiguihua2 <weiguihua2@huawei.com>
1 parent: c9bbd0c

File tree

1 file changed: +3 additions, −0 deletions


vllm_ascend/ops/fused_moe.py (3 additions, 0 deletions)

@@ -1179,6 +1179,9 @@ def forward(self,
             hidden_states, (0, 0, 0, tp_size - num_tokens))
         router_logits = nn.functional.pad(
             router_logits, (0, 0, 0, tp_size - num_tokens))
+        if mc2_mask is not None:
+            mc2_mask = nn.functional.pad(
+                mc2_mask, (0, tp_size - num_tokens))
         chunk_hidden_states = torch.tensor_split(hidden_states,
                                                  tp_size,
                                                  dim=0)
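The change pads `mc2_mask` alongside `hidden_states` and `router_logits` so all three tensors share the same token dimension before `torch.tensor_split`. Note the different pad tuples: the 2-D tensors use a 4-element tuple `(0, 0, 0, n)` (last dim untouched, rows appended), while the 1-D mask uses a 2-element tuple `(0, n)`. A minimal sketch of that behavior, with hypothetical shapes (`tp_size`, `num_tokens`, and the feature sizes are illustrative, not taken from the repo):

```python
import torch
import torch.nn as nn

tp_size = 4      # assumed tensor-parallel world size
num_tokens = 2   # assumed token count on this rank before padding

hidden_states = torch.randn(num_tokens, 8)   # stand-in for hidden_states
router_logits = torch.randn(num_tokens, 3)   # stand-in for router_logits
mc2_mask = torch.ones(num_tokens, dtype=torch.bool)

# 2-D tensors: pad tuple is (left, right, top, bottom) over the last
# two dims, so (0, 0, 0, n) appends n zero rows at the bottom.
hidden_states = nn.functional.pad(
    hidden_states, (0, 0, 0, tp_size - num_tokens))
router_logits = nn.functional.pad(
    router_logits, (0, 0, 0, tp_size - num_tokens))

# 1-D mask: a 2-tuple pads the single dimension. The constant pad
# value defaults to 0, i.e. False, so padded slots stay masked out.
mc2_mask = nn.functional.pad(mc2_mask, (0, tp_size - num_tokens))

print(hidden_states.shape)  # torch.Size([4, 8])
print(mc2_mask)             # tensor([ True,  True, False, False])
```

After padding, all three tensors split evenly into `tp_size` chunks along dim 0, and the `False` tail of the mask marks the padding tokens as inactive.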
