
[Bug]: Qwen/Qwen1.5-MoE-A2.7B-Chat model fails to start #176

Closed
Luo-Jinyan opened this issue Feb 26, 2025 · 1 comment · Fixed by #179
Labels
bug Something isn't working

Comments


Luo-Jinyan commented Feb 26, 2025

Your current environment

NPU 910B && vllm-ascend 0.7.1rc1

🐛 Describe the bug

Error message

 File "vllm_ascend/ops/fused_moe.py", line 152, in forward_oot
    topk_weights, topk_ids = group_topk(
  File "vllm_ascend/ops/fused_moe.py", line 49, in group_topk
    torch_npu.npu_group_topk(input=scores, out=scores, group_num=num_expert_group, k=topk_group)
  File "venv/lib/python3.10/site-packages/torch/_ops.py", line 1116, in __call__
    return self._op(*args, **(kwargs or {}))
RuntimeError: npu::npu_group_topk() Expected a value of type 'int' for argument 'k' but instead found type 'NoneType'.

This error seems to be caused by group_topk being called with topk_group=None. I'm not sure whether this means there is a restriction on which MoE models are supported, or whether something else is going on.
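
For reference, a simplified sketch of the call path implied by the traceback. The wrapper shape is an assumption based on vllm_ascend/ops/fused_moe.py as quoted above; only the npu_group_topk call and its keyword arguments come from the error output.

    import torch
    import torch_npu  # Ascend extension providing the npu_group_topk operator

    def group_topk(scores: torch.Tensor, num_expert_group, topk_group):
        # The operator requires an int for 'k', so a topk_group of None coming
        # from the model config raises the RuntimeError shown in the traceback.
        torch_npu.npu_group_topk(input=scores, out=scores,
                                 group_num=num_expert_group, k=topk_group)
        return scores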

Luo-Jinyan added the bug label Feb 26, 2025

MengqingCao (Contributor) commented Feb 26, 2025

Thanks for this bug report; we indeed need to handle this argument when it is None.

However, we still need to wait for the next CANN version to support MoE models, including Qwen MoE, DeepSeek V3, etc.

FYI, #72 (comment)

MengqingCao added a commit to MengqingCao/vllm-ascend-fork that referenced this issue Feb 26, 2025
Signed-off-by: MengqingCao <[email protected]>
MengqingCao added a commit to MengqingCao/vllm-ascend-fork that referenced this issue Feb 26, 2025
Signed-off-by: MengqingCao <[email protected]>
wangxiyuan pushed a commit that referenced this issue Feb 27, 2025
Fix #176
We need to set `topk_group` and `num_expert_group` to `0` if they are `None`.

Signed-off-by: MengqingCao <[email protected]>
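
A minimal sketch of the guard described in the commit message above, assuming the same simplified wrapper as before (only npu_group_topk and its keyword arguments are taken from the traceback):

    import torch
    import torch_npu

    def group_topk(scores: torch.Tensor, num_expert_group=None, topk_group=None):
        # Per the fix, fall back to 0 when these are not set by the model config,
        # so the NPU operator always receives ints instead of None.
        num_expert_group = 0 if num_expert_group is None else num_expert_group
        topk_group = 0 if topk_group is None else topk_group
        torch_npu.npu_group_topk(input=scores, out=scores,
                                 group_num=num_expert_group, k=topk_group)
        return scores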