Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/06/13 05:31:11 UTC

[GitHub] [incubator-mxnet] RogerChern commented on issue #12369: batchnorm from scratch with autograd gives very different gradient from mx.nd.BatchNorm

URL: https://github.com/apache/incubator-mxnet/issues/12369#issuecomment-501554476
 
 
   Cool, I now get the correct result with the following snippet.
   
    ```python
    import mxnet as mx


    def batch_norm_nd(x, gamma, beta, eps=1e-5):
        # Per-channel batch norm over an NCHW tensor, normalizing over the
        # batch and spatial axes with the biased (population) variance --
        # the same statistics BatchNorm uses in training mode.
        mean = mx.nd.mean(x, axis=(0, 2, 3), keepdims=True)
        var = mx.nd.mean((x - mean) ** 2, axis=(0, 2, 3), keepdims=True)
        x_hat = (x - mean) / mx.nd.sqrt(var + eps)
        # broadcast the (C,) scale/shift across (N, C, H, W)
        return x_hat * gamma.reshape(1, -1, 1, 1) + beta.reshape(1, -1, 1, 1)


    if __name__ == "__main__":
        x1 = mx.nd.random_normal(0.3, 2, shape=(2, 16, 32, 32))
        x2 = x1.copy()
        # BatchNorm expects gamma/beta/moving_mean/moving_var as 1-D arrays of shape (C,)
        gamma = mx.nd.ones(shape=(16,))
        beta = mx.nd.zeros(shape=(16,))
        mmean = mx.nd.zeros(shape=(16,))
        mvar = mx.nd.ones(shape=(16,))
        x1.attach_grad()
        x2.attach_grad()
        gamma.attach_grad()
        beta.attach_grad()

        # feed the same upstream gradient to both paths so the results are comparable
        grad = mx.nd.random_normal(0, 1, shape=(2, 16, 32, 32))
        with mx.autograd.record(train_mode=True):
            y1 = batch_norm_nd(x1, gamma, beta)
        y1.backward(grad)

        # train_mode=True makes BatchNorm normalize with batch statistics,
        # matching what batch_norm_nd computes
        with mx.autograd.record(train_mode=True):
            y2 = mx.nd.BatchNorm(x2, gamma, beta, mmean, mvar,
                                 fix_gamma=False, use_global_stats=False, eps=1e-5)
        y2.backward(grad)

        print("--------------------autograd grad----------------------")
        print(x1.grad[0, 1])
        print("\n\n")

        print("--------------------forward native/autograd----------------------")
        print((y2 / y1)[0, 1])
        print("\n\n")

        print("--------------------backward native/autograd----------------------")
        print((x2.grad / x1.grad)[0, 1])
    ```
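
   For a less eyeball-y comparison than printing the ratio tensors, something like this should work (a rough sketch, assuming the snippet above has already run so `y1`, `y2`, and the gradients are populated; the `1e-12` guard is an arbitrary choice to avoid dividing by exact zeros):

    ```python
    # Maximum relative error between native BatchNorm and the from-scratch version.
    # Values around 1e-5 or smaller mean the two agree up to floating-point noise.
    fwd_err = (mx.nd.abs(y2 - y1) / (mx.nd.abs(y1) + 1e-12)).max().asscalar()
    bwd_err = (mx.nd.abs(x2.grad - x1.grad) / (mx.nd.abs(x1.grad) + 1e-12)).max().asscalar()
    print("max forward relative error:", fwd_err)
    print("max backward relative error:", bwd_err)
    ```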
