Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2021/04/22 16:28:58 UTC

[GitHub] [tvm] mikeseven opened a new pull request #7912: Pytorch convtranspose padding fix

mikeseven opened a new pull request #7912:
URL: https://github.com/apache/tvm/pull/7912


   Pytorch Conv Transpose Padding Fix
   ==================================
   
   Background
   ----------
   
   *   We noticed a discrepancy between the output shapes produced by PyTorch and TVM for a PyTorch network containing a single ``torch.nn.ConvTranspose2d`` operator.
   *   When comparing the attributes of the ``torch.nn.ConvTranspose2d`` operator and the ``tvm.relay.nn.conv2d_transpose``
       operator, the ``output_padding`` parameter in ``tvm.relay.nn.conv2d_transpose`` always defaulted to
       0, regardless of the output padding set in ``torch.nn.ConvTranspose2d``.
   *   Upon further inspection, we found that the import logic for convolution layers in ``tvm/python/tvm/relay/frontend/pytorch.py`` was missing the ``output_padding`` parameter.
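   For reference, PyTorch computes each spatial dimension of a ``ConvTranspose2d`` output as
   ``(in - 1) * stride - 2 * padding + dilation * (kernel - 1) + output_padding + 1`` (per the
   ``torch.nn.ConvTranspose2d`` documentation), so silently dropping ``output_padding`` shrinks
   the output. A minimal sketch of the discrepancy (the helper name is ours, not TVM's):

   ```python
   def conv_transpose_out_size(in_size, kernel, stride=1, padding=0,
                               output_padding=0, dilation=1):
       """Output size along one spatial dim, per the formula documented
       for torch.nn.ConvTranspose2d."""
       return ((in_size - 1) * stride - 2 * padding
               + dilation * (kernel - 1) + output_padding + 1)

   # With output_padding=1 PyTorch produces a size-8 dimension, but an
   # importer that defaults the attribute to 0 computes 7: the mismatch.
   print(conv_transpose_out_size(4, kernel=3, stride=2, padding=1, output_padding=1))
   print(conv_transpose_out_size(4, kernel=3, stride=2, padding=1, output_padding=0))
   ```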
   
   The Fix
   -------
   
   *   All fixes were implemented in ``tvm/python/tvm/relay/frontend/pytorch.py``.
   *   To resolve the missing parameter, the ``convolution`` method of the
       ``PyTorchOpConverter`` class was updated so that it supplies the ``output_padding`` attribute when constructing transposed-convolution Relay operators.
   *   Over the course of the fix we also discovered that the ``convolution`` method converted
       ``torch.nn.ConvTranspose1d`` operations into ``tvm.relay.nn.conv2d_transpose``. They are now
       converted into ``tvm.relay.nn.conv1d_transpose`` operations.
   *   Over the course of the fix we also discovered that ``torch.nn.Conv1d`` operations were being converted into
       ``tvm.relay.nn.conv2d`` operations. They are now converted into ``tvm.relay.nn.conv1d``
       operations, with one caveat: because TVM does not support grouped 1D convolution, as stated in the
       description of ``tvm.relay.nn.conv1d``, in that case we still convert the operation to a 2D convolution, which does
       support grouping. After the 2D convolution, we squeeze the output to obtain the correct shape and
       values for a grouped 1D convolution.
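   The operator selection described above can be summarized as follows. This is an illustrative
   sketch under our reading of the PR, not the actual code in ``pytorch.py``; the function name
   and returned strings are ours:

   ```python
   def pick_relay_conv_op(weight_rank, is_transposed, groups=1):
       """Sketch of the fixed operator selection.

       PyTorch weight layouts: rank 3 -> 1D convolution, rank 4 -> 2D.
       Grouped 1D convolutions fall back to conv2d (relay's conv1d does
       not support groups); the caller unsqueezes the input beforehand
       and squeezes the extra spatial axis out of the result.
       """
       if weight_rank == 3:  # 1D case
           if is_transposed:
               return "nn.conv1d_transpose"
           if groups > 1:
               return "nn.conv2d"  # fallback: unsqueeze, conv2d, squeeze
           return "nn.conv1d"
       if weight_rank == 4:  # 2D case
           return "nn.conv2d_transpose" if is_transposed else "nn.conv2d"
       raise NotImplementedError(f"unsupported weight rank {weight_rank}")
   ```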
   
   Test Coverage
   -------------
   
   *   Extended the ``test_forward_conv_transpose`` test in ``tvm/tests/python/frontend/pytorch/test_forward.py``.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [tvm] mikeseven commented on pull request #7912: Pytorch convtranspose padding fix

Posted by GitBox <gi...@apache.org>.
mikeseven commented on pull request #7912:
URL: https://github.com/apache/tvm/pull/7912#issuecomment-825050235


   updated with missing files





[GitHub] [tvm] masahi closed pull request #7912: Pytorch convtranspose padding fix

masahi closed pull request #7912:
URL: https://github.com/apache/tvm/pull/7912


   

