Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2018/12/05 23:25:16 UTC

[GitHub] DickJC123 commented on issue #13362: Add NHWC layout support to Pooling (cuDNN only)

DickJC123 commented on issue #13362: Add NHWC layout support to Pooling (cuDNN only)
URL: https://github.com/apache/incubator-mxnet/pull/13362#issuecomment-444690325
 
 
   @TaoLv, providing details and motivation for the PR (a partial duplicate of info just added by @ptrendx):
   
    In MXNet, layout is not something that is stored with the NDArray.  Some operators, like pointwise ones, don't care about the layout at all and produce the same output regardless.  Other operators, like Convolution, BatchNorm, and Pooling, need to be told the layout.  Convolution supports a limited number of layouts via the 'layout' parameter, e.g. layout='NHWC'.  BatchNorm doesn't need to know everything about the layout, just which dimension is the 'C' dimension; for this, the BatchNorm op accepts the 'axis' parameter, e.g. axis=3 for NHWC batchnorm.  Prior to this PR, the MXNet Pooling operator had no parameter to specify the layout, so NHWC data always had to be transposed to NCHW (and back) around the pooling call, as in the sketch below.
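    
    For context, here is a minimal sketch (not part of the PR; variable names are illustrative, and the NHWC Convolution path assumes a cuDNN-enabled GPU build) of how layout is conveyed to Convolution and BatchNorm today, and of the transposes Pooling required before this change:
    
        import mxnet as mx
        
        data = mx.sym.Variable('data')   # assume the data is NHWC: (N, H, W, C)
        
        # Convolution accepts an explicit layout string.
        conv = mx.sym.Convolution(data=data, num_filter=64, kernel=(3, 3),
                                  layout='NHWC', name='conv')
        
        # BatchNorm only needs to know which axis is 'C'; for NHWC that is axis=3.
        bn = mx.sym.BatchNorm(data=conv, axis=3, name='bn')
        
        # Before this PR, Pooling had no layout parameter, so NHWC data had to be
        # transposed to NCHW for pooling and back to NHWC afterwards.
        nchw = mx.sym.transpose(bn, axes=(0, 3, 1, 2))
        pool = mx.sym.Pooling(data=nchw, kernel=(2, 2), pool_type='max', stride=(2, 2))
        nhwc = mx.sym.transpose(pool, axes=(0, 2, 3, 1))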
   
   We have two goals with this PR:
        - Create a way to inform Pooling of the layout, in the style of the Convolution 'layout' parameter, thereby allowing direct use of the arbitrary-layout Pooling support offered by cuDNN, and
        - Enable MXNet to support end-to-end processing of NHWC-layout data (i.e. no transposes), which is particularly efficient in mixed precision on Volta Tensor Cores (see the sketch after this list).
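    
    A hedged sketch of the intended usage once Pooling gains a layout parameter (assuming it mirrors the Convolution parameter's name and accepted values, and assuming a cuDNN-enabled GPU build): the NHWC data stays NHWC end to end, with no transposes.
    
        import mxnet as mx
        
        data = mx.sym.Variable('data')   # (N, H, W, C)
        conv = mx.sym.Convolution(data=data, num_filter=64, kernel=(3, 3),
                                  layout='NHWC', name='conv')
        bn   = mx.sym.BatchNorm(data=conv, axis=3, name='bn')
        # Pooling now takes the layout directly, so no transposes are needed.
        pool = mx.sym.Pooling(data=bn, kernel=(2, 2), pool_type='max',
                              stride=(2, 2), layout='NHWC', name='pool')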
   
   
