Add Eagle-3 Qwen support (follow-up to #20436) #21025

Closed
wants to merge 24 commits

Add: preliminary qwen support

8724f40

Mergify / Summary succeeded Jul 16, 2025 in 0s

4 rules match and 15 potential rules

Rule: label-documentation (label)

  • any of:
    • files~=^[^/]+\.md$
    • files~=^docs/
    • files~=^examples/

Rule: label-ci-build (label)

  • any of:
    • files=CMakeLists.txt
    • files=setup.py
    • files~=\.buildkite/
    • files~=^\.github/
    • files~=^cmake/
    • files~=^docker/Dockerfile
    • files~=^requirements.*\.txt
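
In these conditions, `files=` is an exact-path comparison while `files~=` applies a regular expression to each changed file's path; a condition holds if any changed file satisfies it. A minimal sketch of that evaluation, assuming Python-flavored regex semantics (`re.search`, so the `^` anchors in the patterns above matter):

```python
import re

def condition_matches(condition: str, changed_files: list[str]) -> bool:
    """Evaluate a single files condition against a PR's list of changed files."""
    if condition.startswith("files~="):
        # Regex operator: unanchored search, so rules anchor with ^ explicitly.
        pattern = condition[len("files~="):]
        return any(re.search(pattern, path) for path in changed_files)
    if condition.startswith("files="):
        # Plain operator: exact path equality.
        exact = condition[len("files="):]
        return any(path == exact for path in changed_files)
    raise ValueError(f"unsupported condition: {condition}")

# A PR touching setup.py and a GitHub workflow satisfies label-ci-build twice:
changed = ["setup.py", ".github/workflows/test.yml"]
print(condition_matches("files=setup.py", changed))      # True
print(condition_matches(r"files~=^\.github/", changed))  # True
print(condition_matches(r"files~=^cmake/", changed))     # False
```

This is an illustration only; Mergify evaluates these conditions server-side.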

Rule: label-deepseek (label)

  • any of:
    • files~=^examples/.*deepseek.*\.py
    • files~=^tests/.*deepseek.*\.py
    • files~=^vllm/entrypoints/openai/tool_parsers/.*deepseek.*\.py
    • files~=^vllm/model_executor/models/.*deepseek.*\.py
    • files~=^vllm/reasoning/.*deepseek.*\.py
    • files~=^vllm/transformers_utils/.*deepseek.*\.py
    • title~=(?i)DeepSeek

Rule: label-frontend (label)

  • files~=^vllm/entrypoints/

✅ Rule: label-llama (label)

  • any of:
    • files~=^vllm/model_executor/models/.*llama.*\.py
    • files~=^examples/.*llama.*\.py
    • files~=^tests/.*llama.*\.py
    • files~=^vllm/entrypoints/openai/tool_parsers/llama.*\.py
    • files~=^vllm/transformers_utils/configs/.*llama.*\.py
    • title~=(?i)llama

Rule: label-multi-modality (label)

  • any of:
    • files=tests/models/test_vision.py
    • files~=^tests/models/multimodal/
    • files~=^tests/multimodal/
    • files~=^vllm/multimodal/

Rule: label-new-model (label)

  • all of:
    • files=vllm/model_executor/models/registry.py
    • files~=^vllm/model_executor/models/

Rule: label-performance (label)

  • any of:
    • files~=^\.buildkite/nightly-benchmarks/
    • files~=^benchmarks/
    • files~=^tests/benchmarks/
    • files~=^vllm/benchmarks/

✅ Rule: label-qwen (label)

  • any of:
    • files~=^vllm/model_executor/models/.*qwen.*\.py
    • title~=(?i)Qwen
    • files~=^examples/.*qwen.*\.py
    • files~=^tests/.*qwen.*\.py
    • files~=^vllm/reasoning/.*qwen.*\.py

Rule: label-rocm (label)

  • any of:
    • files=vllm/platforms/rocm.py
    • files~=^csrc/rocm/
    • files~=^docker/Dockerfile.rocm
    • files~=^requirements/rocm.*\.txt
    • files~=^tests/kernels/.*_rocm.*\.py
    • files~=^vllm/attention/backends/rocm.*\.py
    • files~=^vllm/attention/ops/rocm.*\.py
    • files~=^vllm/model_executor/layers/fused_moe/rocm.*\.py
    • files~=^vllm/v1/attention/backends/mla/rocm.*\.py
    • title~=(?i)AMD
    • title~=(?i)ROCm

Rule: label-structured-output (label)

  • any of:
    • files=benchmarks/benchmark_serving_structured_output.py
    • files=benchmarks/run_structured_output_benchmark.sh
    • files=docs/features/structured_outputs.md
    • files=examples/offline_inference/structured_outputs.py
    • files=examples/online_serving/openai_chat_completion_structured_outputs.py
    • files=examples/online_serving/openai_chat_completion_structured_outputs_with_reasoning.py
    • files=tests/entrypoints/llm/test_guided_generate.py
    • files=tests/model_executor/test_guided_processors.py
    • files=tests/v1/entrypoints/llm/test_guided_generate.py
    • files~=^benchmarks/structured_schemas/
    • files~=^tests/v1/structured_output/
    • files~=^vllm/model_executor/guided_decoding/
    • files~=^vllm/v1/structured_output/

✅ Rule: label-speculative-decoding (label)

  • any of:
    • files~=^vllm/model_executor/models/.*eagle.*\.py
    • files=vllm/model_executor/layers/spec_decode_base_sampler.py
    • files=vllm/model_executor/models/mlp_speculator.py
    • files~=^examples/.*(spec_decode|mlpspeculator|eagle|speculation).*\.py
    • files~=^tests/spec_decode/
    • files~=^tests/v1/spec_decode/
    • files~=^vllm/spec_decode/
    • files~=^vllm/transformers_utils/configs/(eagle|medusa|mlp_speculator)\.py
    • files~=^vllm/v1/spec_decode/

Rule: label-v1 (label)

  • any of:
    • files~=^tests/v1/
    • files~=^vllm/v1/

Rule: label-tpu (label)

  • any of:
    • files~=/tpu/
    • files~=_tpu
    • files~=pallas
    • files~=tpu.py
    • files~=tpu_

✅ Rule: label-tpu-remove (label)

  • all of:
    • -files~=/tpu/
    • -files~=_tpu
    • -files~=pallas
    • -files~=tpu.py
    • -files~=tpu_

Rule: label-tool-calling (label)

  • any of:
    • files=docs/features/tool_calling.md
    • files=examples/offline_inference/chat_with_tools.py
    • files=examples/online_serving/openai_chat_completion_client_with_tools.py
    • files=examples/online_serving/openai_chat_completion_client_with_tools_required.py
    • files=examples/online_serving/openai_chat_completion_tool_calls_with_reasoning.py
    • files=tests/entrypoints/openai/test_chat_with_tool_reasoning.py
    • files~=^examples/tool_chat_*
    • files~=^tests/entrypoints/openai/tool_parsers/
    • files~=^tests/mistral_tool_use/
    • files~=^tests/tool_use/
    • files~=^vllm/entrypoints/openai/tool_parsers/

Rule: ping author on conflicts and add 'needs-rebase' label (comment, label)

  • -closed
  • conflict

Rule: assign reviewer for tensorizer changes (assign)

  • files~=^tests/entrypoints/openai/test_tensorizer_entrypoint.py
  • files~=^tests/tensorizer_loader/
  • files~=^vllm/model_executor/model_loader/tensorizer.py
  • files~=^vllm/model_executor/model_loader/tensorizer_loader.py

Rule: remove 'needs-rebase' label when conflict is resolved (label)

  • -closed
  • -conflict

💖  Mergify is proud to provide this service for free to open source projects.

🚀  You can help us by becoming a sponsor!


Mergify commands and options

More conditions and actions can be found in the documentation.

You can also trigger Mergify actions by commenting on this pull request:

  • @Mergifyio refresh will re-evaluate the rules
  • @Mergifyio rebase will rebase this PR on its base branch
  • @Mergifyio update will merge the base branch into this PR
  • @Mergifyio backport <destination> will backport this PR on <destination> branch

Additionally, on Mergify dashboard you can:

  • look at your merge queues
  • generate the Mergify configuration with the config editor.

Finally, you can contact us at https://mergify.com