Conversation

@1092626063 (Contributor) commented Nov 21, 2025

What this PR does / why we need it?

ops: fix the nightly CI error in the gatingtopk path by providing defaults for `topk_group` and `num_expert_group`.

Does this PR introduce any user-facing change?

How was this patch tested?

@gemini-code-assist bot left a comment


Code Review

This pull request fixes a potential TypeError in _select_experts_with_fusion_ops by providing default values for topk_group and num_expert_group. My review focuses on a subtle change in behavior this introduces. While the fix is correct for the bug, it alters how softmax scoring handles expert grouping. I've left a comment with a suggestion to preserve the original behavior if the change was unintentional.

Comment on lines +175 to +211
    topk_group = topk_group if topk_group is not None else 1
    num_expert_group = num_expert_group if num_expert_group is not None else 1
    norm_type = 0 if scoring_func == "softmax" else 1
Severity: high

This change correctly provides default values for topk_group and num_expert_group, fixing a potential TypeError. However, it also alters the behavior for scoring_func="softmax". The original code always disabled grouping for softmax by forcing topk_group=1 and num_expert_group=1. This change enables grouping for softmax if grouping parameters are provided.

If this was unintentional, the suggested change below preserves the original behavior for softmax while still fixing the bug for other cases. If enabling grouping for softmax was intended, feel free to disregard this suggestion.

    if scoring_func == "softmax":
        norm_type = 0
        topk_group = 1
        num_expert_group = 1
    else:
        norm_type = 1
        topk_group = topk_group if topk_group is not None else 1
        num_expert_group = num_expert_group if num_expert_group is not None else 1
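The behavioral difference the reviewer describes can be sketched as two standalone helpers; the function names are hypothetical and independent of the actual `_select_experts_with_fusion_ops` implementation in vllm-ascend:

```python
# Hypothetical standalone sketch of the two variants discussed above.
# resolve_grouping_merged mirrors the code as merged in this PR;
# resolve_grouping_suggested mirrors the reviewer's suggestion.

def resolve_grouping_merged(scoring_func, topk_group=None, num_expert_group=None):
    """Merged fix: None defaults to 1; grouping params are honored for softmax."""
    topk_group = topk_group if topk_group is not None else 1
    num_expert_group = num_expert_group if num_expert_group is not None else 1
    norm_type = 0 if scoring_func == "softmax" else 1
    return norm_type, topk_group, num_expert_group

def resolve_grouping_suggested(scoring_func, topk_group=None, num_expert_group=None):
    """Reviewer's variant: softmax always forces grouping off (topk_group=1, num_expert_group=1)."""
    if scoring_func == "softmax":
        return 0, 1, 1
    topk_group = topk_group if topk_group is not None else 1
    num_expert_group = num_expert_group if num_expert_group is not None else 1
    return 1, topk_group, num_expert_group

# The two variants only diverge when grouping parameters are passed with softmax:
print(resolve_grouping_merged("softmax", topk_group=4, num_expert_group=8))     # (0, 4, 8)
print(resolve_grouping_suggested("softmax", topk_group=4, num_expert_group=8))  # (0, 1, 1)
print(resolve_grouping_merged("sigmoid"))                                       # (1, 1, 1)
```

For non-softmax scoring, or when no grouping parameters are passed, both variants agree; the divergence is confined to softmax with explicit grouping parameters, which is exactly the case the review flags.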

@github-actions

👋 Hi! Thank you for contributing to the vLLM Ascend project. The following points will speed up your PR merge:

  • A PR should do only one thing; smaller PRs enable faster reviews.
  • Every PR should include unit tests and end-to-end tests to ensure it works and is not broken by future PRs.
  • Write a clear commit message and fill in the PR description to help reviewers and future developers understand the change.

If CI fails, you can run the linting and testing checks locally according to the Contributing and Testing guides.

Signed-off-by: 1092626063 <1092626063@qq.com>
@wangxiyuan added the ready (read for review) and ready-for-test (start test by label for PR) labels Nov 24, 2025

Labels

module:ops, module:tests, ready (read for review), ready-for-test (start test by label for PR)

2 participants