Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2018/01/19 14:31:20 UTC

[GitHub] asmushetzel opened a new issue #9497: Ensure that all operators properly support kAddTo in backward pass

asmushetzel opened a new issue #9497: Ensure that all operators properly support kAddTo in backward pass
URL: https://github.com/apache/incubator-mxnet/issues/9497
 
 
   PR #9495 fixes one operator whose backward pass did not obey the "req" argument properly (in this case, not at all) and therefore ignored the "kAddTo" directive, which tells the operator to add the computed gradients to the output tensor(s) rather than simply assign them. Ignoring "kAddTo" produces wrong gradients whenever an operator's output fans out to more than one consumer, because each consumer's gradient contribution must be accumulated.
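   The req-handling contract can be sketched as follows. This is an illustrative pure-Python sketch, not MXNet code: the names `OpReq` and `backward_identity` are hypothetical stand-ins for MXNet's C++ `OpReqType` dispatch, shown here only to make the kWriteTo/kAddTo distinction concrete.

   ```python
   from enum import Enum

   class OpReq(Enum):
       # Hypothetical mirror of MXNet's OpReqType values (illustrative names).
       kWriteTo = 0   # overwrite the gradient buffer
       kAddTo = 1     # accumulate into the gradient buffer

   def backward_identity(out_grad, in_grad, req):
       """Backward pass of an identity op that honors `req`.

       A buggy operator would always assign, silently dropping any
       gradient already accumulated in `in_grad` by another consumer.
       """
       for i, g in enumerate(out_grad):
           if req is OpReq.kAddTo:
               in_grad[i] += g   # fan-out case: add to existing gradient
           else:
               in_grad[i] = g    # plain assignment

   # Fan-out scenario: the same input feeds two consumers, so the engine
   # calls backward once with kWriteTo and once with kAddTo.
   grad = [0.0, 0.0, 0.0]
   backward_identity([1.0, 2.0, 3.0], grad, OpReq.kWriteTo)
   backward_identity([1.0, 2.0, 3.0], grad, OpReq.kAddTo)
   # grad now holds the accumulated contributions [2.0, 4.0, 6.0].
   ```

   An operator that ignores "req" and always assigns would instead end with [1.0, 2.0, 3.0], losing the first consumer's contribution entirely.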
   
   We should examine all operators to verify that they handle the "req" parameter properly in the backward pass. There is at least one more that doesn't: **svm_output**.
   
   As these problems are basic and easy to fix but hard to detect (they may just slightly derail training over time), we should really prioritize this sanity checking.
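   One generic way to catch such bugs is a finite-difference sanity check on an input that fans out. The sketch below is pure Python and purely illustrative (it does not use MXNet's test utilities): for y = sin(x) + sin(x), the same input is consumed twice, so a backward pass honoring "kAddTo" accumulates cos(x) twice, while an assign-only backward reports only cos(x) and disagrees with the numeric gradient.

   ```python
   import math

   def numeric_grad(f, x, eps=1e-6):
       # Central finite difference: a reference gradient independent of
       # any operator's backward implementation.
       return (f(x + eps) - f(x - eps)) / (2 * eps)

   x = 0.3
   # Correct backward: both consumers' contributions are accumulated.
   analytic_correct = math.cos(x) + math.cos(x)
   # Buggy backward: the second write clobbers the first contribution.
   analytic_buggy = math.cos(x)

   numeric = numeric_grad(lambda v: math.sin(v) + math.sin(v), x)
   # `analytic_correct` matches `numeric` closely; `analytic_buggy` is
   # off by a factor of two, which is exactly the kind of error that
   # quietly degrades training rather than crashing it.
   ```

   A check of this shape, run over each operator's backward pass with req set to kAddTo, would flag the svm_output-style bugs mechanically.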

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services