
Using allreduce_avg to eliminate scale in auto parallel DP #61622

Conversation

@From00 From00 commented Feb 5, 2024

PR types

Performance optimization

PR changes

Others

Description

Pcard-76459
This PR uses the allreduce_avg operator to replace the allreduce_sum + scale pattern used for gradient-synchronization communication in data parallelism. This improves communication efficiency and keeps all communication operations contiguous in the network, so the scale compute op no longer interferes with the communication-fusion optimization strategy in sharding.
The optimization is enabled via the auto_parallel.strategy.gradient_scale_using_allreduce_avg switch and only takes effect when the NCCL version is >= 2.10. Testing the llama1-13B-PP2-SD4-GBS128-ACC32 model on a single A800 machine with 8 GPUs shows a performance gain of about 2%.
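The semantic equivalence that makes this rewrite safe can be illustrated with a small simulation (this is not Paddle's implementation, just a NumPy sketch of what the two collective patterns compute):

```python
import numpy as np

def allreduce_sum(grads):
    """Simulate an allreduce with ReduceOp.SUM: every rank receives the sum."""
    total = np.sum(grads, axis=0)
    return [total.copy() for _ in grads]

def allreduce_avg(grads):
    """Simulate an allreduce with ReduceOp.AVG (available since NCCL 2.10):
    the averaging happens inside the collective, so no separate scale op runs."""
    avg = np.mean(grads, axis=0)
    return [avg.copy() for _ in grads]

world_size = 4
rank_grads = [np.full(3, float(r + 1)) for r in range(world_size)]  # per-rank gradients

# Old pattern: allreduce_sum followed by a standalone scale kernel.
summed = allreduce_sum(rank_grads)
scaled = [g / world_size for g in summed]

# New pattern: a single fused collective, no trailing compute op.
averaged = allreduce_avg(rank_grads)

assert all(np.allclose(a, b) for a, b in zip(scaled, averaged))
```

Because the averaging is folded into the collective, the communication ops stay adjacent in the program, which is what lets sharding's communication-fusion pass group them.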

This PR also includes the following supporting work:

  1. Add the allreduce_avg and reduce_avg communication operators.
  2. Add an nccl_version API on the Python side for checking the NCCL version.
  3. Fix and improve the sharding pass, including adapting the grad_group logic, adapting the multi-stream dependency-analysis logic, and completing dist_attr, so that stage1, stage2, and their companion optimization strategies all run on the Llama model. On Llama, stage2 currently outperforms stage1 (+3%), but the stage2 communication-fusion strategy produces incorrect results because it is incompatible with the gradient_merge gradient-accumulation strategy; this will be fixed in PR [Auto Parallel] Move reduce to opt stage #62157.
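Since ReduceOp.AVG only exists in NCCL 2.10 and later, the optimization has to be gated on the runtime NCCL version. A hedged sketch of such a check follows; the version-code encoding (major*10000 + minor*100 + patch, as NCCL itself uses since 2.9) is an assumption, and Paddle's actual nccl_version API may report the version differently:

```python
# Hypothetical sketch: decide whether the allreduce_avg rewrite may be applied,
# given an integer NCCL version code. The encoding below is assumed, not taken
# from Paddle's API.

def decode_nccl_version(code: int) -> tuple:
    """Decode major*10000 + minor*100 + patch into a comparable tuple."""
    major, rest = divmod(code, 10000)
    minor, patch = divmod(rest, 100)
    return (major, minor, patch)

def supports_allreduce_avg(code: int) -> bool:
    """ReduceOp.AVG was introduced in NCCL 2.10, so require >= (2, 10, 0)."""
    return decode_nccl_version(code) >= (2, 10, 0)

assert supports_allreduce_avg(21003)      # NCCL 2.10.3: use allreduce_avg
assert not supports_allreduce_avg(20908)  # NCCL 2.9.8: keep allreduce_sum + scale
```

When the check fails, the pass would simply fall back to the existing allreduce_sum + scale lowering.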

Related PR adapting the model-building switch in PaddleNLP: PaddlePaddle/PaddleNLP#8021


paddle-bot bot commented Feb 5, 2024

Your PR has been submitted. Thanks for your contribution!
Please wait for the result of CI firstly. See Paddle CI Manual for details.


@JZ-LIANG JZ-LIANG left a comment


LGTM


@zhiqiu zhiqiu left a comment


LGTM


@heavyrain-lzy heavyrain-lzy left a comment


LGTM


@sunzhongkai588 sunzhongkai588 left a comment


LGTM. Every revision requires rerunning CI, so please submit the documentation changes as a separate PR.

Contributor


  • If this API is exposed on the official website, the current documentation is too brief; please follow the English documentation template.
  • Please add the corresponding Chinese documentation as well.

Contributor Author


OK, I will add it in a follow-up PR.

Contributor Author


Follow-up documentation PRs:
English: #62480
Chinese: PaddlePaddle/docs#6515


@XiaoguangHu01 XiaoguangHu01 left a comment


LGTM

@From00 From00 merged commit eb93d67 into PaddlePaddle:develop Mar 5, 2024
30 checks passed