Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/06/05 17:04:23 UTC

[GitHub] [incubator-mxnet] apeforest edited a comment on issue #15120: [bug] fix higher grad log

URL: https://github.com/apache/incubator-mxnet/pull/15120#issuecomment-499170409
 
 
   @kshitij12345 I have a question about the equation `expected_head_grad = (grad_op(x) * head_grad_grads).asnumpy()` in your test.
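   
   (Taking the `log` operator from this PR as a concrete case, I read `grad_op(x)` as 1/x, so the test seems to expect dL/d head_grad = head_grad_grads / x. Please correct me if I misread; see also the sketch after the equations below.)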
   
   My understanding from the chain rule:
   
   ```
   Given y = f(x):
   
   dL/dx = dL/dy * dy/dx  -->  this is the first backward pass.
   Let dL/dy be y_grad; we get dL/dx (denoted x_grad).
   
   Now rewrite the above equation as a function of its inputs:
   
   input0: y_grad
   input1: x
   output: x_grad = y_grad * f'(x)
   
   A second backward pass through this function would give:
   
   dL/d(y_grad) = dL/d(x_grad) * f'(x)
   dL/dx        = dL/d(x_grad) * y_grad * f''(x)
   ```
   
   What is the meaning of dL/d(y_grad)? Are we treating y_grad as another input variable here?
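   
   To make sure I read the test correctly, here is a minimal runnable sketch of how I picture the two backward passes (assuming `mx.autograd.grad` keeps the multiplication by `head_grads` in the recorded graph when `create_graph=True`; the variable names are mine, not the test's):
   
   ```python
   from mxnet import nd, autograd
   
   x = nd.array([1.0, 2.0, 4.0])
   x.attach_grad()
   y_grad = nd.array([1.0, 1.0, 1.0])   # dL/dy, fed in as head_grads
   y_grad.attach_grad()
   
   with autograd.record():
       y = nd.log(x)                    # f(x) = log(x), the op this PR touches
       # First backward pass, kept in the recorded graph:
       # x_grad = y_grad * f'(x) = y_grad / x
       x_grad = autograd.grad(y, [x], head_grads=y_grad,
                              create_graph=True, retain_graph=True)[0]
   
   # Second backward pass through x_grad (implicit dL/d(x_grad) of ones).
   x_grad.backward()
   print(x.grad)       # dL/dx      = y_grad * f''(x) = -y_grad / x**2
   print(y_grad.grad)  # dL/dy_grad = f'(x)           =  1 / x
   ```
   
   If that is right, then y_grad really is treated as an input to the backward graph, which would explain the `expected_head_grad` line.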
   
   Thanks for your clarification.
   
   
