Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2018/05/09 16:56:59 UTC
[GitHub] eric-haibin-lin opened a new issue #10868: _backward_softsign activation is incorrect
URL: https://github.com/apache/incubator-mxnet/issues/10868
The following test case will fail. It is a modified version of the existing test case at https://github.com/apache/incubator-mxnet/blob/master/tests/python/unittest/test_operator.py#L5867:
```
def test_activation():
    shape = (9, 10)
    dtype_l = [np.float64, np.float32, np.float16]
    rtol_l = [1e-7, 1e-6, 1e-2]
    atol_l = [1e-7, 1e-6, 1e-2]
    rtol_fd = 1e-5
    atol_fd = 1e-6
    num_eps = 1e-6
    unary_ops = {
        'softsign': [lambda x: mx.sym.Activation(x, act_type='softsign'),
                     lambda x: x / (1. + np.abs(x)),
                     lambda x: 1. / np.square(1. + np.abs(x)),
                     -3.0, 3.0],  # input low/high bounds, consumed as op[3], op[4] below
    }
    # Loop over operators
    for name, op in unary_ops.items():
        # Loop over dtypes
        for ind in range(len(dtype_l)):
            dtype = dtype_l[ind]
            rtol = rtol_l[ind]
            atol = atol_l[ind]
            compare_forw_backw_unary_op(
                name, op[0], op[1], op[2], shape, op[3], op[4], rtol, atol,
                dtype)
        # Finite difference testing
        finite_diff_unary_op(
            name, op[0], shape, op[3], op[4], rtol_fd, atol_fd, num_eps)
```
Reason: for y = softsign(x), the inputs expected by _backward_softsign and _backward_Activation differ: _backward_softsign takes (dy, x) as input, while _backward_Activation takes (dy, y) as input.
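To see why that input mismatch produces a wrong gradient, here is a minimal NumPy sketch (not MXNet code; variable names are mine). The softsign derivative is 1 / (1 + |x|)^2 and must be evaluated at the original input x; if the saved output y is fed in where x is expected, the result differs everywhere except near zero:

```
import numpy as np

x = np.array([-2.0, -0.5, 0.5, 2.0])
y = x / (1.0 + np.abs(x))  # softsign forward

# Correct gradient: derivative evaluated at the input x
grad_from_x = 1.0 / np.square(1.0 + np.abs(x))

# What _backward_Activation's convention would compute if the same
# formula were applied to the saved output y instead of x
grad_from_y = 1.0 / np.square(1.0 + np.abs(y))

print(grad_from_x)  # e.g. 1/9 at x = +-2
print(grad_from_y)  # noticeably larger values
```

The two arrays only agree at x = 0, which is why the numerical-gradient check in the test above fails.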
@nswamy