Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2020/04/07 21:12:35 UTC

[GitHub] [incubator-mxnet] zhreshold commented on issue #17989: [Autograd] Very serious bug of grad_req='add'

zhreshold commented on issue #17989: [Autograd] Very serious bug of grad_req='add'
URL: https://github.com/apache/incubator-mxnet/issues/17989#issuecomment-610623359
 
 
   I don't think it's related to a particular op implementation; something may not be working at all once autograd is involved.
   See the experiments I did: https://github.com/apache/incubator-mxnet/issues/16708#issuecomment-558876214
   
   What I did was replicate the same node N times. For N in (1, 2, 3, 4, 8, 9, 10, ...), the loss and gradients are always correct; however, for N in (5, 6, 7), the gradients diverge at the first iteration.
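   To make the expectation explicit: with grad_req='add', a parameter's gradient buffer should accumulate one contribution per use rather than being overwritten, so replicating a node N times should scale its gradient by exactly N. The following is a conceptual NumPy sketch of that semantics (not MXNet code, and not the actual reproduction script from the linked experiment); the function name and shapes are illustrative only.

   ```python
   import numpy as np

   # Conceptual sketch (NumPy, not MXNet) of grad_req='add' semantics:
   # when a parameter w feeds N replicated nodes, its gradient buffer
   # should accumulate one contribution per replica, not be overwritten.

   def loss_and_grad(w, x, n_replicas):
       # loss = sum over N replicas of sum(w * x); d(loss)/dw = N * x
       grad_w = np.zeros_like(w)   # gradient buffer, accumulate-mode
       loss = 0.0
       for _ in range(n_replicas):
           loss += float(np.sum(w * x))
           grad_w += x             # accumulate, do not overwrite
       return loss, grad_w

   w = np.array([1.0, 2.0])
   x = np.array([3.0, 4.0])

   for n in (1, 5, 7):
       loss, g = loss_and_grad(w, x, n)
       # correct behavior: gradient scales linearly with replica count,
       # for every N -- the reported bug breaks this for N in (5, 6, 7)
       assert np.allclose(g, n * x)
   ```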
