Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/03/04 21:09:58 UTC

[GitHub] [incubator-mxnet] anirudhacharya commented on issue #14301: SoftmaxOutput crashes with normalization "valid"

URL: https://github.com/apache/incubator-mxnet/issues/14301#issuecomment-469422711
 
 
I can also confirm this issue; it happens only when `normalization="valid"` is set, and only while executing the `Executor.backward` call. For instance, this sample code works fine -
   
```
import mxnet as mx

xpu = mx.cpu()
x_nd = mx.nd.array([[1, 6, 4, 2], [1, 6, 4, 2]], ctx=xpu)
label_nd = mx.nd.array([1, 1], ctx=xpu)

# Record the forward pass so gradients w.r.t. x_nd can be computed.
x_nd.attach_grad()

with mx.autograd.record():
    y = mx.nd.SoftmaxOutput(data=x_nd, label=label_nd, ignore_label=0, use_ignore=True)  # , normalization="valid")

y.backward()
print(x_nd.grad)
```
   
So the bug is in the gradient calculation of `SoftmaxOutput` when `normalization="valid"` is set.
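
For reference, here is the same snippet with `normalization="valid"` uncommented - a minimal sketch of the failing case; per the report above, the crash occurs during the `backward()` call:

```
import mxnet as mx

xpu = mx.cpu()
x_nd = mx.nd.array([[1, 6, 4, 2], [1, 6, 4, 2]], ctx=xpu)
label_nd = mx.nd.array([1, 1], ctx=xpu)

x_nd.attach_grad()

with mx.autograd.record():
    # Same call as above, but with normalization="valid" - the
    # configuration this issue reports as crashing on backward.
    y = mx.nd.SoftmaxOutput(data=x_nd, label=label_nd, ignore_label=0,
                            use_ignore=True, normalization="valid")

y.backward()  # reported to crash here, in the gradient computation
print(x_nd.grad)
```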
