Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/07/02 04:48:05 UTC
[GitHub] [incubator-mxnet] apeforest commented on issue #15288: [MXNET-978] Higher order gradient for sigmoid
URL: https://github.com/apache/incubator-mxnet/pull/15288#issuecomment-507516653
I verified that the results match PyTorch's:
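For reference, the closed forms used in the script below follow from repeatedly differentiating s = sigmoid(x):

    s'   = s * (1 - s)
    s''  = s' * (1 - 2s)
    s''' = s'' * (1 - 2s) - 2 * (s')^2  =  s'' - 2 * ((s')^2 + s'' * s)

These are exactly what the grad_op, grad_grad_op, and grad_grad_grad_op lambdas encode.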
```
import torch
import numpy as np

# sigmoid and its first three derivatives in closed form
op = lambda x: torch.sigmoid(x)
grad_op = lambda x: op(x) * (1 - op(x))
grad_grad_op = lambda x: grad_op(x) * (1 - 2 * op(x))
grad_grad_grad_op = lambda x: grad_grad_op(x) - 2 * (grad_op(x)**2 + grad_grad_op(x) * op(x))

x = torch.tensor(np.array([1, 2, 3]), dtype=torch.float32)
# distinct head gradients for each backward pass so that scaling errors show up
head_grads = torch.tensor(np.array([1, 1, 1]), dtype=torch.float32) * 0.5
head_grad_grads = torch.tensor(np.array([1, 1, 1]), dtype=torch.float32) * 0.6
head_grad_grad_grads = torch.tensor(np.array([1, 1, 1]), dtype=torch.float32) * 0.7
x.requires_grad = True
head_grads.requires_grad = True

# first-order gradient: d(y . head_grads)/dx = sigmoid'(x) * head_grads
y = op(x)
x_grad = torch.autograd.grad(y, x, grad_outputs=head_grads,
                             create_graph=True, retain_graph=True)[0]
expected_grad_x = (grad_op(x) * head_grads).detach().numpy()
print('expected_grad_x = {}'.format(expected_grad_x))
print('grad_x = {}'.format(x_grad.detach().numpy()))

# second-order gradient: d(x_grad . head_grad_grads)/dx
x_grad_grad = torch.autograd.grad(x_grad, x, grad_outputs=head_grad_grads,
                                  create_graph=True, retain_graph=True)[0]
# third-order gradient accumulates into x.grad
x_grad_grad.backward(head_grad_grad_grads)

expected_grad_grad_x = (grad_grad_op(x) * head_grads * head_grad_grads).detach().numpy()
# expected gradient w.r.t. head_grads (computed for reference, not printed below)
expected_head_grad = (grad_op(x) * head_grad_grads).detach().numpy()
expected_grad_grad_grad_x = (grad_grad_grad_op(x) * head_grads * head_grad_grads
                             * head_grad_grad_grads).detach().numpy()
print('expected_grad_grad_x = {}'.format(expected_grad_grad_x))
print('grad_grad_x = {}'.format(x_grad_grad.detach().numpy()))
print('expected_grad_grad_grad_x = {}'.format(expected_grad_grad_grad_x))
print('grad_grad_grad_x = {}'.format(x.grad.detach().numpy()))
```
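For comparison, here is a minimal sketch of the same second-order check on the MXNet side, assuming the mx.autograd.grad support with create_graph/retain_graph that this PR builds on (head-gradient values chosen to mirror the PyTorch script):

```
import mxnet as mx
from mxnet import nd, autograd

x = nd.array([1, 2, 3])
x.attach_grad()
head_grads = nd.ones_like(x) * 0.5
head_grad_grads = nd.ones_like(x) * 0.6

with autograd.record():
    y = nd.sigmoid(x)
    # first-order gradient, kept in the graph so it can be differentiated again
    x_grad = autograd.grad(y, x, head_grads=head_grads,
                           create_graph=True, retain_graph=True)[0]
# second-order gradient accumulates into x.grad
x_grad.backward(head_grad_grads)
print(x.grad)  # should match grad_grad_x from the PyTorch script above
```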
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
With regards,
Apache Git Services