Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2018/03/06 21:11:08 UTC
[GitHub] sxjscience commented on issue #10002: General support of OPs for second-order gradient
URL: https://github.com/apache/incubator-mxnet/issues/10002#issuecomment-370929547
@lightingghost Borrowing the discussion from https://github.com/apache/incubator-mxnet/issues/9979 here:
```python
import mxnet.ndarray as nd
from mxnet import autograd
x = nd.array([3.0])
x.attach_grad()
with autograd.record():
    y = x ** 2
    # Take the first-order gradient dy/dx while keeping the graph,
    # so that it can be differentiated a second time.
    y_grad = autograd.grad(y, x, create_graph=True, retain_graph=True)[0]
    z = y_grad ** 2
z.backward()
print(x.grad)  # intended second-order result; the second differentiation raises the error below instead
```
```log
MXNetError: [12:44:29] src/pass/gradient.cc:187: Operator _backward_power_scalar is non-differentiable because it didn't register FGradient attribute.
```
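For reference, a hand-derived check (plain Python, no MXNet) of the value the snippet should print once `_backward_power_scalar` gains an FGradient registration:

```python
# Expected second-order result for the graph above:
#   y     = x**2        ->  dy/dx = 2*x
#   z     = (dy/dx)**2  =   4*x**2
#   dz/dx = 8*x
x = 3.0
print(8 * x)  # 24.0, the value x.grad should hold after z.backward()
```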