
[XPU] Support several ops on precision of fp16. #10025

Merged 1 commit into PaddlePaddle:develop on Feb 28, 2023

Conversation

stevenshen36 (Contributor)

PR devices

XPU

PR types

New features

PR changes

OP

Description

Add silu/sin/cos/slice ops for fp16 precision on XPU backend.
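For context, silu (sigmoid linear unit) is defined as x · sigmoid(x). A minimal fp32 reference in Python, for illustration only (the PR's actual kernels are C++ running on the XPU backend):

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def silu(x: float) -> float:
    # silu(x) = x * sigmoid(x); the PR adds an fp16 XPU kernel for this op
    return x * sigmoid(x)

print(silu(0.0))            # 0.0
print(round(silu(1.0), 4))  # 0.7311
```

An fp16 kernel computes the same function, just in half precision, which is why the unit tests below need a nonzero error tolerance.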

paddle-bot (bot) commented on Feb 23, 2023

Thanks for your contribution!

lite/kernels/x86/calib_compute.cc (review thread, outdated, resolved)
lite/kernels/xpu/activation_compute.cc (review thread, outdated, resolved)
lite/kernels/xpu/cos_compute.cc (review thread, outdated, resolved)
stevenshen36 force-pushed the fp16_op branch 2 times, most recently from 3393166 to ef102db on February 23, 2023 at 12:19
zhupengyang (Collaborator) left a comment:

Newly added ops must have at least fp32 unit tests.

stevenshen36 force-pushed the fp16_op branch 2 times, most recently from 04890ed to ec4217d on February 25, 2023 at 04:35
stevenshen36 (Contributor, Author) commented:

Test cases for the new ops have been added, and the compile error caused by the undefined float16 type has been fixed. @zhupengyang

zhupengyang (Collaborator) left a comment:

LGTM

The remaining minor issues can be fixed in a follow-up PR.

Comment on lines +116 to +119
using xpu_calib_fp32_to_fp16_kfp16 =
paddle::lite::kernels::xpu::CalibCompute<float, float16, PRECISION(kFP16)>;
using xpu_calib_fp16_to_fp32_kfp16 =
paddle::lite::kernels::xpu::CalibCompute<float16, float, PRECISION(kFP16)>;
Collaborator comment:

Please rename these:
xpu_calib_fp32_to_fp16
xpu_calib_fp16_to_fp32

stevenshen36 (Contributor, Author) replied:

Let's change this after the model has been validated on the business side.
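The CalibCompute instantiations above insert fp32-to-fp16 and fp16-to-fp32 casts around fp16 kernels. A minimal sketch of the per-element rounding such a calib step performs (Python, for illustration only; the real kernel runs on the XPU device):

```python
import struct

def to_fp16_and_back(x: float) -> float:
    # Round to IEEE 754 half precision and widen back to float,
    # mimicking what an fp32 -> fp16 calib kernel does per element.
    return struct.unpack('<e', struct.pack('<e', x))[0]

print(to_fp16_and_back(0.5))  # 0.5 (exactly representable in fp16)
print(to_fp16_and_back(0.1))  # 0.0999755859375 (nearest fp16 value)
```

The rounding error visible on 0.1 is the kind of precision loss the fp16 unit tests must tolerate.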

Comment on lines +695 to +698
#elif defined(LITE_WITH_XPU)
place = TARGET(kXPU);
alias = "silu_fp32";
abs_error = 2e-4;
Collaborator comment:

Move this to the line just before the LITE_WITH_ARM branch.
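The abs_error = 2e-4 in the snippet above is the tolerance the unit test allows between the XPU kernel's output and the reference. A sketch of that kind of comparison (hypothetical helper name, Python for illustration):

```python
def outputs_close(ref, out, abs_error=2e-4):
    # Element-wise absolute-error check, the kind of comparison an op
    # unit test makes between kernel output and its reference result.
    return all(abs(r - o) <= abs_error for r, o in zip(ref, out))

print(outputs_close([0.31123, 0.73106], [0.31120, 0.73101]))  # True
print(outputs_close([0.31123], [0.31200]))                    # False (diff ~7.7e-4)
```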

Comment on lines +80 to +82
#elif defined(LITE_WITH_XPU)
Place place(TARGET(kXPU), PRECISION(kFloat));
test_sin(place);
Collaborator comment:

Same as above.

Comment on lines +84 to +86
#elif defined(LITE_WITH_XPU)
Place place(TARGET(kXPU), PRECISION(kFloat));
test_sin(place);
Collaborator comment:

Same as above.

@zhupengyang zhupengyang merged commit cbfe3c3 into PaddlePaddle:develop Feb 28, 2023