
[AutoParallel] add release_gradients and comm_buffer_size_MB to strategy #9432

Merged
merged 5 commits into PaddlePaddle:develop from AndSonder:add_sharding_config
Nov 20, 2024

Conversation

AndSonder
Contributor

PR types

Others

PR changes

Others

Description

add release_gradients and comm_buffer_size_MB to strategy
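For context, a minimal sketch of how these two knobs could be threaded from training arguments into an auto-parallel sharding strategy. The field names `release_gradients` and `comm_buffer_size_MB` come from the PR title; everything else (the dataclass, the `apply_sharding_config` helper, the default values, and the `strategy.sharding` attribute layout) is a hypothetical illustration, not the actual implementation in `paddlenlp/trainer/training_args.py`.

```python
from dataclasses import dataclass
from types import SimpleNamespace

# Hypothetical sketch only: the real change lives in
# paddlenlp/trainer/training_args.py and may be structured differently.
@dataclass
class AutoParallelShardingConfig:
    # Presumably: release gradient buffers once they are consumed,
    # trading a re-allocation per step for lower peak memory.
    release_gradients: bool = False
    # Presumably: size (in MB) of the fused buffer used to batch
    # gradient communication (all-reduce / reduce-scatter) calls.
    comm_buffer_size_MB: int = 256

def apply_sharding_config(strategy, cfg: AutoParallelShardingConfig):
    """Copy the two new knobs onto a strategy object (hypothetical layout)."""
    strategy.sharding.release_gradients = cfg.release_gradients
    strategy.sharding.comm_buffer_size_MB = cfg.comm_buffer_size_MB
    return strategy

# Usage with a stand-in strategy object:
strategy = SimpleNamespace(sharding=SimpleNamespace())
apply_sharding_config(strategy, AutoParallelShardingConfig(release_gradients=True))
print(strategy.sharding.release_gradients, strategy.sharding.comm_buffer_size_MB)
# True 256
```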

Waiting for:


paddle-bot bot commented Nov 14, 2024

Thanks for your contribution!


codecov bot commented Nov 14, 2024

Codecov Report

Attention: Patch coverage is 0% with 2 lines in your changes missing coverage. Please review.

Project coverage is 52.92%. Comparing base (79bba4f) to head (914ccaa).
Report is 1 commit behind head on develop.

Files with missing lines           | Patch % | Lines
paddlenlp/trainer/training_args.py | 0.00%   | 2 Missing ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##           develop    #9432      +/-   ##
===========================================
+ Coverage    52.70%   52.92%   +0.21%     
===========================================
  Files          677      677              
  Lines       109391   107943    -1448     
===========================================
- Hits         57658    57130     -528     
+ Misses       51733    50813     -920     


Collaborator

@wawltor wawltor left a comment


LGTM

@wawltor wawltor merged commit d5a90f7 into PaddlePaddle:develop Nov 20, 2024
9 of 12 checks passed
@AndSonder AndSonder deleted the add_sharding_config branch November 22, 2024 04:55