Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/07/22 21:51:00 UTC

[GitHub] [incubator-mxnet] sxjscience opened a new issue #15627: [OP][Refactor] Merge native implementations of variance-related operators: BatchNorm, GroupNorm, LayerNorm, Moments

URL: https://github.com/apache/incubator-mxnet/issues/15627
 
 
   Several operators in MXNet compute both the forward pass and the gradient w.r.t. the variance, i.e., `var(X)` (see the sketch after this list):
   
   - Moments
   https://github.com/apache/incubator-mxnet/blob/master/src/operator/nn/moments.cc
   - BatchNorm
   https://github.com/apache/incubator-mxnet/blob/master/src/operator/nn/batch_norm.cc
   - LayerNorm
   https://github.com/apache/incubator-mxnet/blob/master/src/operator/nn/layer_norm.cc
   - GroupNorm
   https://github.com/apache/incubator-mxnet/blob/master/src/operator/nn/group_norm.cc
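   
   The shared math is small: all four operators need the mean, the biased variance, and the chain rule through `var(X)`. Here is a minimal NumPy sketch (hypothetical helper names, not the current MXNet code) of what a common kernel would compute:
   
   ```python
   import numpy as np
   
   def variance_forward(x, axis):
       # Mean and biased variance along the reduction axis; this is the
       # forward computation shared by Moments/BatchNorm/LayerNorm/GroupNorm.
       mean = x.mean(axis=axis, keepdims=True)
       var = ((x - mean) ** 2).mean(axis=axis, keepdims=True)
       return mean, var
   
   def variance_backward(x, mean, grad_var, axis):
       # Chain rule through var(X): d var / d x_i = 2 * (x_i - mean) / N,
       # multiplied by the incoming gradient grad_var.
       n = x.shape[axis]
       return grad_var * (2.0 / n) * (x - mean)
   ```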
   
   These code paths can be merged so that the optimization tricks currently scattered across them are shared by all four operators, so I propose to refactor them (one possible shape is sketched below).
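   
   For example (hypothetical names, assuming the `variance_forward` helper sketched above), each operator would reduce to the shared kernel over its own axis grouping and keep only its affine step:
   
   ```python
   def moments(x, axis):
       # Moments is the shared kernel itself.
       return variance_forward(x, axis)
   
   def layer_norm(x, gamma, beta, eps=1e-5):
       # LayerNorm normalizes over the last axis; BatchNorm/GroupNorm differ
       # only in which axes they reduce over and how channels are grouped.
       mean, var = variance_forward(x, axis=-1)
       return gamma * (x - mean) / np.sqrt(var + eps) + beta
   ```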
   
   We can also refer to ATen's implementation: https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/native/Normalization.cpp
   
   @pengzhao-intel @haojin2, what do you think?
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services