Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/06/16 06:47:37 UTC

[GitHub] [incubator-mxnet] kshitij12345 commented on issue #15253: Add higher order gradient support `sigmoid`, `tan`, `tanh`

kshitij12345 commented on issue #15253: Add higher order gradient support `sigmoid`, `tan`, `tanh`
URL: https://github.com/apache/incubator-mxnet/pull/15253#issuecomment-502426152
 
 
   @larroy @apeforest Please help.
   
   I am facing an issue here.
   
   https://github.com/apache/incubator-mxnet/blob/f49013a60b8ea5d6c75ba3515f25a8c346748269/src/operator/tensor/elemwise_unary_op_basic.cc#L123-L137
   
   Here I am returning `input[1] * ograd` as the value for `x_grad_grad`, which I expect to be `f(x) * ograd`, since we use `ElemGradUseOut` to wrap the `_backward_sigmoid` function, so `input[1]` should hold the saved output `f(x)`.
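   
   For context, these are the identities I am working from (a quick NumPy sketch just to pin down notation; the finite-difference check is mine, not from the PR):
   
```python
import numpy as np

def f(x):        # sigmoid
    return 1.0 / (1.0 + np.exp(-x))

def f_prime(x):  # f'(x) = f(x) * (1 - f(x))
    y = f(x)
    return y * (1.0 - y)

# With ElemGradUseOut, input[1] of _backward_sigmoid should be the saved
# output y = f(x), so `input[1] * ograd` should come out as f(x) * ograd.
x, ograd, eps = 0.3, 1.0, 1e-6

# Finite-difference sanity check of f'(x).
fd = (f(x + eps) - f(x - eps)) / (2 * eps)
assert np.isclose(fd, f_prime(x))

print("expected x_grad_grad:", f(x) * ograd)               # what I return
print("observed x_grad_grad:", f(x) * f_prime(x) * ograd)  # what the test reports
```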
   
   However, looking at the tests:
   
   https://github.com/apache/incubator-mxnet/blob/f49013a60b8ea5d6c75ba3515f25a8c346748269/tests/python/unittest/test_higher_order_grad.py#L109-L123
   
   https://github.com/apache/incubator-mxnet/blob/f49013a60b8ea5d6c75ba3515f25a8c346748269/tests/python/unittest/test_higher_order_grad.py#L136-L151
   
   You can see that the actual value returned is `f(x) * f'(x) * ograd`.
   I have also tried the method suggested by @larroy of using `create_graph=False` and `retain_graph=True`.
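   
   For completeness, this is roughly how I am measuring the second-order gradient against my branch (a simplified sketch of what the linked tests do, assuming the standard `mxnet.autograd.grad` API; the input values and the implicit unit head gradient are my choices):
   
```python
import mxnet as mx
from mxnet import nd, autograd

x = nd.array([0.3, -1.2, 2.5])
x.attach_grad()
with autograd.record():
    y = nd.sigmoid(x)
    # First-order gradient, kept differentiable (create_graph=True)
    # so we can backpropagate through it a second time.
    x_grad = autograd.grad(heads=y, variables=x,
                           create_graph=True, retain_graph=True)[0]
x_grad.backward()  # populates x.grad with the second-order gradient

y_np = nd.sigmoid(x).asnumpy()
print("x.grad       :", x.grad.asnumpy())
print("f(x) * f'(x) :", y_np * y_np * (1 - y_np))  # what I observe
print("f(x)         :", y_np)                      # what I expected
```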
   
   Thank You.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services