Pytorch Conv Transpose Padding Fix #7958
Conversation
* Merge commit (# Conflicts: python/tvm/relay/frontend/tensorflow.py)
* …-conv-transpose-pad-fix
* …rameter for conv transpose operations * updating pytorch converter to correctly convert conv1d to conv1d in tvm instead of a flattened conv2d unless under circumstances of grouped convolution * updating pytorch converter to correctly convert conv1d transpose to conv1d transpose in tvm instead of a flattened conv2d transpose * added tests to cover these latest additions
* …-conv-transpose-pad-fix
* …-conv-transpose-pad-fix
* …he#34) SYSOL-584 Pytorch Conv Transpose Padding Fix (Approved-by: Alicja Kwasniewska, Approved-by: Mikael Sevenier)
Link to discussion: https://discuss.tvm.apache.org/t/pytorch-conv-transpose-padding-fix/9873

Thanks, can we close #7912?
Yes. Thanks.
> On Sat, May 1, 2021 at 3:47 PM masahi wrote: Thanks, can we close #7912?
…anspose-padding-fix # Conflicts: # tests/python/frontend/pytorch/test_forward.py
@masahi How can we pass the tvm-ci/pr-merge step? Looking at the details, the error comes up as:

I do not believe the changes in this PR cause this issue.

I think just retriggering ci will sometimes fix the problem.

@Jeffrey-Sima sorry, missed your last comment; yes, please try a rebase and retrigger ci.
…anspose-padding-fix
thanks @Jeffrey-Sima
* fix conv transpose import from TF
* fix String::fromwe() to String::from()
* fixing pytorch converter to take into account the output_padding parameter for conv transpose operations
* updating pytorch converter to correctly convert conv1d to conv1d in tvm instead of a flattened conv2d unless under circumstances of grouped convolution
* updating pytorch converter to correctly convert conv1d transpose to conv1d transpose in tvm instead of a flattened conv2d transpose
* added tests to cover these latest additions
* removing print statements used for debugging
* fixing typos and formatting
* fixing formatting
* fixing grammar
* formatting fixes
* updated formatting after running pylint and python_format checks

Co-authored-by: Mikael Sevenier <mikael.sevenier@sima.ai>
Background

When converting a PyTorch model containing the `torch.nn.ConvTranspose2d` operator, there was a mismatch between the `torch.nn.ConvTranspose2d` operator and the `tvm.relay.nn.conv2d_transpose` operator: the `output_padding` parameter in `tvm.relay.nn.conv2d_transpose` would always default to 0, regardless of what output padding was set in `torch.nn.ConvTranspose2d`. In `tvm/python/tvm/relay/frontend/pytorch.py`, the import logic for convolution layers was missing the `output_padding` parameter.

The Fix
In `tvm/relay/frontend/pytorch.py`, the `PyTorchOpConverter` class is updated so that, when it constructs the relay convolution op, it supplies the `output_padding` attribute in the cases where it is creating convolution transpose operations.

In addition, `torch.nn.ConvTranspose1d` operations were being converted into `tvm.relay.nn.conv2d_transpose`. This was fixed so that they are now converted into `tvm.relay.nn.conv1d_transpose` operations. Similarly, `torch.nn.Conv1d` operations were being converted into `tvm.relay.nn.conv2d` operations; they are now converted into `tvm.relay.nn.conv1d` operations. There is a slight caveat: because tvm does not support grouped 1D convolution, as stated in the description of `tvm.relay.nn.conv1d`, in that case we convert the operation to a 2D convolution, which does support grouped convolution. After the 2D convolution, we then squeeze the output to get the correct shape and values for a grouped 1D convolution.
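The grouped-convolution fallback described above can be sketched on the PyTorch side. This is a minimal illustration with assumed shapes, not the converter code itself: a height-1 axis is added so that a 2D convolution carries out the grouped 1D convolution, and the extra axis is squeezed away afterwards.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical shapes: 4 input channels, 2 groups, kernel width 3.
x = torch.randn(1, 4, 16)   # (N, C_in, W)
w = torch.randn(4, 2, 3)    # (C_out, C_in/groups, K)

# Reference: direct grouped 1D convolution.
ref = F.conv1d(x, w, groups=2)

# Fallback: insert a dummy spatial axis, run a grouped 2D convolution,
# then squeeze the axis back out to recover the 1D result.
x2 = x.unsqueeze(2)         # (N, C_in, 1, W)
w2 = w.unsqueeze(2)         # (C_out, C_in/groups, 1, K)
out = F.conv2d(x2, w2, groups=2).squeeze(2)

print(torch.allclose(ref, out, atol=1e-5))  # True
```

The squeeze at the end mirrors the step in the Fix where the 2D output is reshaped to match what a grouped 1D convolution would have produced.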
Test Coverage

Tests were added to the `test_forward_conv_transpose` test in `tvm/tests/python/frontend/pytorch/test_forward.py`.
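To see why dropping `output_padding` matters, a small PyTorch snippet (illustrative parameters, not taken from the PR's test suite) shows the parameter directly changing the output length, so a converter that ignores it yields the wrong output shape:

```python
import torch

x = torch.randn(1, 3, 8)

# Identical transposed convolutions except for output_padding; a converter
# that silently drops the parameter would produce the first shape for both.
a = torch.nn.ConvTranspose1d(3, 6, kernel_size=3, stride=2, output_padding=0)
b = torch.nn.ConvTranspose1d(3, 6, kernel_size=3, stride=2, output_padding=1)

print(a(x).shape)  # torch.Size([1, 6, 17])
print(b(x).shape)  # torch.Size([1, 6, 18])
```

The output length follows (L_in - 1) * stride - 2 * padding + kernel_size + output_padding, so the single unit of output padding shows up directly in the last dimension.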