[perf] Support MOE Multi-stream in Deepseek #947


Merged
merged 10 commits into vllm-project:main on Jun 5, 2025

Conversation

@David9857 (Contributor) commented May 24, 2025

What this PR does / why we need it?

Supports inner multi-stream parallelism for MOE layers in Deepseek.
This feature requires graph mode with MC2 enabled.
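
For context, the core technique is running two pieces of MOE work on separate device streams so they overlap. A minimal sketch of the pattern, assuming the standard PyTorch stream API (torch_npu mirrors it under torch.npu on Ascend); the function and tensor names here are illustrative, not the PR's actual code:

import torch

# Illustrative sketch only: overlap one branch of MOE work (e.g. shared
# experts) with another (e.g. routed experts / MC2 communication) by
# running it on a side stream. On Ascend NPUs use torch.npu.Stream etc.
side_stream = torch.cuda.Stream()

def moe_forward_with_overlap(routed_input, shared_input, routed_fn, shared_fn):
    main_stream = torch.cuda.current_stream()
    # The side stream must wait until its inputs are ready on the main stream.
    side_stream.wait_stream(main_stream)
    with torch.cuda.stream(side_stream):
        shared_out = shared_fn(shared_input)  # runs concurrently with routed_fn
    routed_out = routed_fn(routed_input)      # stays on the main stream
    # Re-join the streams before consuming both results.
    main_stream.wait_stream(side_stream)
    return routed_out + shared_out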

Does this PR introduce any user-facing change?

How was this patch tested?

global_bs = 0
moe_expert_num = len(expert_map)
# hidden_states = hidden_states.bfloat16()
kwargs = {
kwargs1 = {
Collaborator review comment on the diff above:

rename to a readable name
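
A minimal illustration of the requested rename; the replacement names and dict contents below are hypothetical, since the diff above elides the actual arguments:

# Hypothetical rename: give each anonymous dict a name that says which
# MC2 op it configures, instead of kwargs / kwargs1.
dispatch_kwargs = {
    # ... arguments for the MC2 dispatch op (elided in the diff above) ...
}
combine_kwargs = {
    # ... arguments for the MC2 combine op (elided in the diff above) ...
}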

@@ -36,6 +36,8 @@
lambda: bool(int(os.getenv("COMPILE_CUSTOM_KERNELS", "1"))),
"VLLM_ENABLE_MC2":
lambda: bool(int(os.getenv("VLLM_ENABLE_MC2", '0'))),
"VLLM_ENABLE_CV_PARALLEL":
@wangxiyuan (Collaborator) commented May 29, 2025:

Use additional_config instead of an env var, since this change is only used for torchair GE mode. As in #839, there are another three new config options coming.

How about:

{
   "additional_config": {
      "torchair_graph_config": {
         "enable": True,
         "enable_cv_parallet": True,
         "batch_sizes": "12345",
         "batch_sizes_init": True
      } 
  }
}

cc @zzzzwwjj
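
A sketch of how model code could consume such a nested config, assuming additional_config is a plain dict on the current vLLM config; the helper below is hypothetical, though get_current_vllm_config is the same accessor a later diff in this PR uses:

from vllm.config import get_current_vllm_config

def torchair_graph_option(key: str, default=False):
    # additional_config is an opaque dict; the "torchair_graph_config"
    # nesting follows the suggestion above.
    additional_config = get_current_vllm_config().additional_config
    if not additional_config:
        return default
    return additional_config.get("torchair_graph_config", {}).get(key, default)

# Key name spelled as in the suggestion above:
enable_cv_parallel = torchair_graph_option("enable_cv_parallet")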


David9857 added 2 commits June 4, 2025 11:46:

use additional_config to enable cv parallel
Signed-off-by: David9857 <985700846@qq.com>

rename kwargs1 in fused_experts_with_mc2
Signed-off-by: David9857 <985700846@qq.com>

github-actions bot commented Jun 4, 2025

This pull request has conflicts, please resolve those before we can evaluate the pull request.


github-actions bot commented Jun 5, 2025

This pull request has conflicts, please resolve those before we can evaluate the pull request.

@@ -179,6 +180,12 @@ def __init__(
else:
self.gate.e_score_correction_bias = None

self.enable_cv_parallel = False
additional_config = get_current_vllm_config().additional_config
Collaborator review comment on the diff above:

Please use ascend_config instead now. Note that the doc should be updated at the same time.
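
For reference, a hedged sketch of the ascend_config-based check the reviewer asks for; the module path and attribute names below are assumptions patterned on the torchair_graph_config suggestion above, not the final merged code:

from vllm_ascend.ascend_config import get_ascend_config  # assumed path

def cv_parallel_enabled() -> bool:
    graph_cfg = get_ascend_config().torchair_graph_config  # assumed attribute
    # Multi-stream (cv parallel) only applies with torchair graph mode on.
    return bool(getattr(graph_cfg, "enabled", False)
                and getattr(graph_cfg, "enable_cv_parallel", False))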

Signed-off-by: David9857 <985700846@qq.com>

github-actions bot commented Jun 5, 2025

This pull request has conflicts, please resolve those before we can evaluate the pull request.

bugfix

Signed-off-by: David9857 <985700846@qq.com>
@wangxiyuan merged commit 78431b3 into vllm-project:main on Jun 5, 2025
23 checks passed