Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/07/17 16:58:25 UTC

[GitHub] [incubator-mxnet] kshitij12345 commented on a change in pull request #15331: [fix] missing input log higher order.

kshitij12345 commented on a change in pull request #15331: [fix] missing input log higher order.
URL: https://github.com/apache/incubator-mxnet/pull/15331#discussion_r304541519
 
 

 ##########
 File path: src/operator/tensor/elemwise_unary_op_basic.cc
 ##########
 @@ -1117,15 +1117,15 @@ MXNET_OPERATOR_REGISTER_BINARY_WITH_SPARSE_CPU_DR(_backward_log10,
                                                   unary_bwd<mshadow_op::log10_grad>)
 .set_attr<nnvm::FGradient>("FGradient",
   [](const nnvm::NodePtr& n, const std::vector<nnvm::NodeEntry>& ograds) {
-    // ograds[0]: dL/dxgrad
+    // ograds[0]: dL/dygrad
 
 Review comment:
   I guess it should be dL/dy_grad, since what we are computing/returning here is dL/dx_grad.
   For example:
   ```
   y = f(x_grad)
   L = g(y)  # x_grad formed part of the network and affected the loss
   
   During backprop, by the chain rule:
   dL/dx_grad = dL/dy * dy/dx_grad
   
   In the comments, we have called dL/dy (from the example above) dL/dy_grad.
   ```
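   
   To make that chain rule concrete, here is a minimal numeric sketch (plain Python, not MXNet code; the concrete f, g and the log10 first derivative are just placeholders chosen for illustration):
   ```
   import math
   
   # Running example: the first backward of log10 produces
   #   x_grad = y_grad * f'(x), with f'(x) = 1 / (x * ln(10)).
   x, y_grad = 2.0, 1.0
   fprime = 1.0 / (x * math.log(10.0))
   x_grad = y_grad * fprime
   
   # Downstream, x_grad feeds the rest of the (hypothetical) network:
   #   y = f(x_grad) = x_grad ** 2 and L = g(y) = 3 * y.
   y = x_grad ** 2
   L = 3.0 * y
   
   # Chain rule from the example above: dL/dx_grad = dL/dy * dy/dx_grad.
   dL_dy = 3.0                # g'(y)
   dy_dx_grad = 2.0 * x_grad  # f'(x_grad)
   dL_dx_grad = dL_dy * dy_dx_grad
   
   # Finite-difference check of the same quantity.
   eps = 1e-6
   L_eps = 3.0 * ((x_grad + eps) ** 2)
   print(dL_dx_grad, (L_eps - L) / eps)  # the two values should agree closely
   ```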
   
   That is why we have:
   https://github.com/apache/incubator-mxnet/blob/5b95fb3ee3581ba20fe1def336621d68a811e17f/src/operator/tensor/elemwise_unary_op_basic.cc#L1111-L1112
   
   These multiplications perform:
   ```
   dL/dx_grad = dL/dy * dy/dx_grad
   ```
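   
   For reference, this is the second-order path that the registered FGradient feeds. A rough usage sketch from Python (a sketch only, assuming an MXNet build in which the higher-order gradient of log is registered, as this series of PRs does):
   ```
   from mxnet import autograd, nd
   
   x = nd.array([1.0, 2.0, 3.0])
   x.attach_grad()
   with autograd.record():
       y = nd.log(x)
       # First-order gradient, kept in the graph so it can be differentiated again.
       x_grad = autograd.grad(y, x, create_graph=True, retain_graph=True)[0]  # 1/x
   # Backprop through the first backward node: x.grad should hold
   # d(sum(1/x))/dx = -1/x^2.
   x_grad.backward()
   print(x.grad)  # expect roughly [-1.0, -0.25, -0.1111]
   ```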

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services