
Fix TP + TransformerEngine issue. #1341

Open · wants to merge 2 commits into main
Conversation

wang2yn84 (Collaborator) commented Mar 5, 2025

Data sharding applies to the data input pipeline, while TP (tensor parallelism) applies to model parallelism. The TP + TransformerEngine combination suffers from conflating the two, which causes training to hang due to much longer NCCL communication. Verified by running Llama 8B on 16 nodes; training works fine with this change.

FIXES: b/384441280
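
For context, here is a minimal JAX sketch (not the code in this PR; the mesh layout, axis names `"data"` and `"tensor"`, and all shapes are assumptions) of the distinction above: the input batch is sharded only over the data-parallel mesh axis, so the data pipeline never triggers tensor-parallel collectives, while the model weights are sharded over the tensor-parallel axis.

```python
# Minimal sketch, not this PR's code: axis names and shapes are hypothetical.
import jax
import numpy as np
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# Arrange devices into a (data, tensor) grid, e.g. 2-way tensor parallelism.
devices = np.array(jax.devices()).reshape(-1, 2)
mesh = Mesh(devices, axis_names=("data", "tensor"))

# The batch is split only along "data" and replicated along "tensor",
# so loading data involves no tensor-parallel NCCL communication.
data_sharding = NamedSharding(mesh, P("data", None))
batch = jax.device_put(np.ones((8, 1024), dtype=np.float32), data_sharding)

# Model weights, in contrast, are sharded along the "tensor" axis (TP).
weight_sharding = NamedSharding(mesh, P(None, "tensor"))
weights = jax.device_put(np.ones((1024, 4096), dtype=np.float32), weight_sharding)
```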

Checklist

Before submitting this PR, please make sure (put X in square brackets):

  • I have performed a self-review of my code.
  • I have necessary comments in my code, particularly in hard-to-understand areas.
  • I have run end-to-end tests and provided workload links above if applicable.
  • I have made or will make corresponding changes to the doc if needed.

wang2yn84 requested a review from aireenmei, March 6, 2025 00:21
wang2yn84 (Collaborator, Author) commented:

Adding @aireen for review

aireenmei (Collaborator) left a comment:


I'm not familiar with the impact of this change; it seems orthogonal to any data input. I suggest having @gobbleturk or @khatwanimohit take a look.
