Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/05/20 20:41:33 UTC

[GitHub] [incubator-mxnet] apeforest edited a comment on issue #14991: Second order gradient wrt inputs, expected behaviour.
URL: https://github.com/apache/incubator-mxnet/issues/14991#issuecomment-494119975
 
 
   Calling autograd.grad on a first-order gradient ndarray does not seem to work this way. The API design could have been better documented.
   The following block works.
   ```python
   import mxnet as mx
   from mxnet import nd

   def test_ag_grad():
       x = mx.nd.array([1, 2, 3])
       y = mx.nd.array([2, 3, 4])
       x.attach_grad()
       y.attach_grad()
       with mx.autograd.record():
           z = nd.elemwise_add(x, y)
           # First-order gradient of z w.r.t. x, kept differentiable via
           # create_graph=True so it can be backpropagated through again.
           x_grad = mx.autograd.grad(z, x, create_graph=True, retain_graph=True)[0]
           print(x_grad)
           fg_f = 2 * x_grad
       # Second-order pass: differentiate fg_f (a function of the
       # first-order gradient) w.r.t. x; the result lands in x.grad.
       fg_f.backward()
       print(x.grad)
   ```
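   For reference, the value the block above should leave in x.grad can be sanity-checked without MXNet. Since z = x + y is linear in x, dz/dx is constant, so fg_f = 2 * x_grad is constant as well and its gradient w.r.t. x is zero. A minimal NumPy sketch (the helper `numerical_second_grad` is a hypothetical name, not an MXNet API) confirms this with a central-difference estimate:
   ```python
   import numpy as np

   def numerical_second_grad(f, x, eps=1e-4):
       """Elementwise central-difference estimate of d^2 f / dx^2."""
       return (f(x + eps) - 2 * f(x) + f(x - eps)) / eps**2

   x = np.array([1.0, 2.0, 3.0])
   y = np.array([2.0, 3.0, 4.0])

   # z = x + y is linear in x, so its second derivative w.r.t. x is zero;
   # that zero vector is what x.grad should hold after fg_f.backward().
   second = numerical_second_grad(lambda t: t + y, x)
   print(second)
   ```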
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services