Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2018/11/20 03:26:14 UTC

[GitHub] stu1130 commented on a change in pull request #13328: adding test for mkldnn softmax operator for large negative inputs

URL: https://github.com/apache/incubator-mxnet/pull/13328#discussion_r234858561
 
 

 ##########
 File path: tests/python/mkl/test_mkldnn.py
 ##########
 @@ -383,6 +383,17 @@ def check_fullyconnected_training(stype):
     for stype in stypes:
         check_fullyconnected_training(stype)
 
+def test_softmax_with_large_negative_inputs():
+    input_data = mx.nd.array([[[[-1e30,-1e30]]]])
+    data = mx.sym.Variable('data')
+    out1 = data.softmax(axis=1)
+    exec1 = out1.bind(mx.cpu(), args={'data': input_data})
+    exec1.forward()[0].wait_to_read()
+    ndarr = exec1.outputs[0][0][0][0]
+    nparr = ndarr.asnumpy()
+    assert not np.isnan(nparr).any()
 
 Review comment:
   Could we assert the real output values instead of just checking for NaN, to make the test more robust?
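
   For illustration, a minimal sketch of what such an assertion could look like. This is not code from
   the pull request; np_softmax is a hypothetical NumPy reference helper, and the expected values follow
   from the test above (softmax over axis=1, which has size 1 for this input shape, so every output
   element should be exactly 1.0):

       import numpy as np
       import mxnet as mx
       from mxnet.test_utils import assert_almost_equal

       def np_softmax(x, axis):
           # numerically stable reference softmax: subtract the per-axis max before exponentiating
           x = x - np.max(x, axis=axis, keepdims=True)
           e = np.exp(x)
           return e / np.sum(e, axis=axis, keepdims=True)

       input_data = mx.nd.array([[[[-1e30, -1e30]]]])
       data = mx.sym.Variable('data')
       out = data.softmax(axis=1)
       exe = out.bind(mx.cpu(), args={'data': input_data})
       mx_out = exe.forward()[0].asnumpy()

       # compare against reference values instead of only checking for NaN
       assert_almost_equal(mx_out, np_softmax(input_data.asnumpy(), axis=1))

   Comparing against a reference softmax would also catch cases where the MKL-DNN path returns finite
   but incorrect values, which a NaN check alone would miss.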

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services