
Add multi_precision for adagrad op #50078

Merged (3 commits) Mar 3, 2023

Conversation

@AnnaTrainingG (Contributor) commented Jan 30, 2023

PR types

New features

PR changes

APIs

Describe

Add a multi_precision argument to the adagrad API to support AMP O2 training.
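For context, a minimal sketch of the master-weight pattern that multi_precision optimizers use under AMP O2 (this is an illustration in NumPy, not PaddlePaddle's actual kernel): FP16 parameters are paired with an FP32 master copy, the Adagrad update is computed entirely in FP32, and the FP16 parameters are refreshed from the master.

```python
import numpy as np

def adagrad_step(master_param, moment, grad_fp16, lr=0.1, epsilon=1e-6):
    # Upcast the FP16 gradient to FP32 before any arithmetic.
    grad = grad_fp16.astype(np.float32)
    # Accumulate squared gradients in the FP32 moment buffer (in place).
    moment += grad * grad
    # Apply the Adagrad update to the FP32 master weights (in place).
    master_param -= lr * grad / (np.sqrt(moment) + epsilon)
    # Return an FP16 copy of the updated parameters for the forward pass.
    return master_param.astype(np.float16)

master = np.ones(4, dtype=np.float32)   # FP32 master weights
moment = np.zeros(4, dtype=np.float32)  # FP32 accumulator
g = np.full(4, 0.5, dtype=np.float16)   # FP16 gradient from the backward pass
p16 = adagrad_step(master, moment, g)
```

Keeping the moment and master weights in FP32 avoids the accumulator underflowing or saturating in half precision over many steps.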

@jiweibo (Contributor) left a comment

LGTM for dispensable inout and default attr

@AnnaTrainingG AnnaTrainingG merged commit 4779c2c into PaddlePaddle:develop Mar 3, 2023
MT param_out_data =
in - (lr_data * grad_data) / (sqrt(moment_out_data) + epsilon);

param_out[i] = static_cast<MT>(param_out_data);
@liudongxue01 (Contributor) commented Mar 16, 2023

Shouldn't this be changed to param_out[i] = static_cast<T>(param_out_data); here?
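The distinction matters because param_out stores values in the parameter's storage type T (e.g. FP16), while MT is the wider math type (FP32) used for the computation. A rough NumPy illustration of the final narrowing cast (assumption: T = float16, MT = float32; variable names are hypothetical):

```python
import numpy as np

# The update is computed in the math type MT (float32)...
param_out_data = np.float32(0.1) - np.float32(0.0123456789)

# ...but the output buffer holds the storage type T (float16),
# so the value must be explicitly narrowed before being written back.
as_T = np.float16(param_out_data)
```

Casting to MT instead would leave the value in the math type rather than the parameter's storage type, which is the mismatch the review comment points out.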

@AnnaTrainingG (Contributor, Author) replied:

A fix PR has been submitted: #51790

    def _create_accumulators(self, block, parameters):
        assert isinstance(block, framework.Block)

        if isinstance(parameters, dict):
            parameters = self._update_param_group(parameters)

        for p in parameters:
            if self._multi_precision and p.dtype == core.VarDesc.VarType.FP16:
                master_p = self._create_master_weight(p)
                self._add_accumulator(self._moment_acc_str, master_p)
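To see why the accumulator's fill value matters (the point of the review comment below), here is a hedged sketch, not Paddle's implementation: Adagrad's moment buffer is conventionally initialized to initial_accumulator_value rather than zero, and a nonzero start damps the very first updates.

```python
import numpy as np

def make_accumulator(shape, initial_accumulator_value=0.0):
    # The moment starts at initial_accumulator_value everywhere;
    # with 0.0 the first step's denominator is just epsilon,
    # producing a much larger initial update.
    return np.full(shape, initial_accumulator_value, dtype=np.float32)

m0 = make_accumulator((3,), initial_accumulator_value=0.1)
```

If the master-weight accumulator is created without passing that fill value, its initial state silently diverges from the non-AMP path.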
A reviewer (Contributor) commented:

Is fill_value=self.initial_accumulator_value missing here?

@AnnaTrainingG (Contributor, Author) replied:

The fix PR #51790 has already been submitted.

4 participants