[MLU] Fix Llama attention_mask in npu and mlu #9075
Conversation
Thanks for your contribution!
Codecov Report
Attention: Patch coverage is

Additional details and impacted files

@@            Coverage Diff             @@
##           develop    #9075      +/-   ##
===========================================
- Coverage    53.56%   53.51%    -0.05%
===========================================
  Files          652      652
  Lines       106397   105187     -1210
===========================================
- Hits         56987    56293      -694
+ Misses       49410    48894      -516

☔ View full report in Codecov by Sentry.
@@ -1653,7 +1653,7 @@ def forward(
         is_casual = True
     else:
         is_casual = is_casual_mask(attention_mask)
-    if get_env_device() != "npu" or get_env_device() != "mlu":
+    if get_env_device() != "npu" and get_env_device() != "mlu":
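The root cause is a boolean logic error: get_env_device() returns a single device string, and any one string always differs from at least one of "npu" and "mlu", so the old "or" condition was true on every device and the npu/mlu path was never taken. A minimal standalone sketch of the difference, with get_env_device stubbed out purely for illustration:

# Hypothetical stub: pretend the runtime reports an NPU device.
def get_env_device() -> str:
    return "npu"

device = get_env_device()

# Old check: true for every device, because the string can never equal
# both "npu" and "mlu" at the same time.
old_condition = device != "npu" or device != "mlu"

# Fixed check: true only when the device is neither npu nor mlu.
new_condition = device != "npu" and device != "mlu"

print(old_condition, new_condition)  # prints: True False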
Suggested change:
-    if get_env_device() != "npu" and get_env_device() != "mlu":
+    if get_env_device() not in ["npu", "mlu"]:
Would this be a bit better?
Yes, that is a bit more explicit.
LGTM
PR types
Bug fixes
PR changes
Others
Description
Fix Llama attention_mask handling on npu and mlu: the device check previously used "or", which made the condition always true, so the npu/mlu-specific branch was never taken; it now uses "and".
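As a rough illustration of the intended behavior (not the actual PaddleNLP code; the function and argument names below are simplified assumptions), the corrected check keeps the attention_mask on npu and mlu and only drops it elsewhere when the mask is causal:

# Hypothetical, simplified sketch of the mask handling around this fix.
def maybe_drop_attention_mask(attention_mask, is_casual: bool, device: str):
    # On devices other than npu/mlu, a causal mask can be dropped,
    # presumably so the fused attention path treats it as implicitly causal.
    # On npu/mlu the explicit mask is kept, which is what this PR restores.
    if device not in ("npu", "mlu"):
        if is_casual:
            attention_mask = None
    return attention_mask

For example, with this sketch maybe_drop_attention_mask(mask, True, "npu") keeps the mask, while the same call with device "gpu" returns None.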