Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2020/12/17 20:58:18 UTC

[GitHub] [incubator-mxnet] waytrue17 opened a new pull request #19691: [WIP] ONNX fix softmax

waytrue17 opened a new pull request #19691:
URL: https://github.com/apache/incubator-mxnet/pull/19691


   ## Description ##
   Fix the ONNX export of softmax to support `use_length`.
   Add a test case.
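
   For context, a minimal sketch of what `use_length` means on the MXNet side; this is standard `mx.nd.softmax` behavior rather than code from this PR, and the shapes and values are illustrative assumptions:

   ```python
   import mxnet as mx

   x = mx.nd.random.uniform(0, 1, (2, 3, 4))
   # One length per slice along the softmax axis; the length array has x's shape
   # with the softmax axis removed (here axis=-1, so shape (2, 3)).
   lengths = mx.nd.array([[2, 0, 3], [1, 3, 2]], dtype=int)
   y = mx.nd.softmax(x, length=lengths, axis=-1, use_length=True)
   # Each slice of y should sum to 1 over its first `length` entries, with the
   # remaining entries masked to 0; this is the behavior the ONNX export must reproduce.
   ```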
   
   ## Checklist ##
   ### Essentials ###
   - [ ] PR's title starts with a category (e.g. [BUGFIX], [MODEL], [TUTORIAL], [FEATURE], [DOC], etc)
   - [ ] Changes are complete (i.e. I finished coding on this PR)
   - [ ] All changes have test coverage
   - [ ] Code is well-documented
   
   ### Changes ###
   - [ ] Feature1, tests, (and when applicable, API doc)
   - [ ] Feature2, tests, (and when applicable, API doc)
   
   ## Comments ##
   - If this change is a backward incompatible change, why must this change be made.
   - Interesting edge cases to note here
   





[GitHub] [incubator-mxnet] waytrue17 commented on a change in pull request #19691: [WIP] ONNX fix softmax

Posted by GitBox <gi...@apache.org>.
waytrue17 commented on a change in pull request #19691:
URL: https://github.com/apache/incubator-mxnet/pull/19691#discussion_r545408064



##########
File path: tests/python-pytest/onnx/test_operators.py
##########
@@ -205,3 +206,15 @@ def test_onnx_export_fully_connected(tmp_path, dtype, num_hidden, no_bias, flatt
     if not no_bias:
         args.append(mx.nd.random.uniform(0,1,(num_hidden,)))
     op_export_test('FullyConnected', M, args, tmp_path)
+
+@pytest.mark.parametrize('dtype', ['float32', 'float64'])
+def test_onnx_export_softmax(tmp_path, dtype):
+    x = mx.nd.random.uniform(0, 1, (2, 3, 4), dtype=dtype)
+    M1 = def_model('softmax')
+    op_export_test('softmax_1', M1, [x], tmp_path)
+    M2 = def_model('softmax', use_length=True, axis=0)
+    l2 = mx.nd.array([[2,0,3,1],[1,3,2,0], [0,0,0,1]], dtype=int)

Review comment:
    `l2` and `l3` are intentionally constructed to cover specific edge cases, e.g. [0,0,0,1] and [0,0,0]. I think it should be fine since `x` is randomly generated.
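
    As a concrete illustration of the edge cases mentioned above (the `l2` values are copied from the diff; the call below is plain MXNet `softmax`, shown here as a hedged sketch rather than code from this PR):

    ```python
    import mxnet as mx

    x = mx.nd.random.uniform(0, 1, (2, 3, 4))
    # Shape (3, 4): x's shape with axis 0 removed, matching axis=0 below.
    l2 = mx.nd.array([[2, 0, 3, 1], [1, 3, 2, 0], [0, 0, 0, 1]], dtype=int)
    y = mx.nd.softmax(x, length=l2, axis=0, use_length=True)
    # A length of 0 should mask the whole axis-0 slice at that position, which is
    # exactly the kind of corner case a converted ONNX graph can get wrong.
    ```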







[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #19691: [WIP] ONNX fix softmax

Posted by GitBox <gi...@apache.org>.
mxnet-bot commented on pull request #19691:
URL: https://github.com/apache/incubator-mxnet/pull/19691#issuecomment-747697904


   Hey @waytrue17, thanks for submitting the PR.
   All tests are already queued to run once. If tests fail, you can trigger one or more tests again with the following commands: 
   - To trigger all jobs: @mxnet-bot run ci [all] 
   - To trigger specific jobs: @mxnet-bot run ci [job1, job2] 
   *** 
   **CI supported jobs**: [website, windows-cpu, clang, miscellaneous, unix-gpu, centos-gpu, edge, unix-cpu, centos-cpu, sanity, windows-gpu]
   *** 
   _Note_: 
    Only the following 3 categories can trigger CI: PR Author, MXNet Committer, Jenkins Admin.
   All CI tests must pass before the PR can be merged. 
   





[GitHub] [incubator-mxnet] Zha0q1 merged pull request #19691: [v1.x] ONNX fix softmax

Posted by GitBox <gi...@apache.org>.
Zha0q1 merged pull request #19691:
URL: https://github.com/apache/incubator-mxnet/pull/19691


   





[GitHub] [incubator-mxnet] waytrue17 commented on pull request #19691: [v1.x] ONNX fix softmax

Posted by GitBox <gi...@apache.org>.
waytrue17 commented on pull request #19691:
URL: https://github.com/apache/incubator-mxnet/pull/19691#issuecomment-748526235


   @mxnet-bot run ci [unix-gpu, unix-gpu-cu110]





[GitHub] [incubator-mxnet] waytrue17 commented on a change in pull request #19691: [v1.x] ONNX fix softmax

Posted by GitBox <gi...@apache.org>.
waytrue17 commented on a change in pull request #19691:
URL: https://github.com/apache/incubator-mxnet/pull/19691#discussion_r545532939



##########
File path: tests/python-pytest/onnx/test_operators.py
##########
@@ -253,3 +254,16 @@ def test_onnx_export_dropout(tmp_path, dtype, p):
     M = def_model('Dropout', p=p)
     x = mx.nd.array([[3,0.5,-0.5,2,7],[2,-0.4,7,3,0.2]], dtype=dtype)
     op_export_test('Dropout', M, [x], tmp_path)
+
+@pytest.mark.parametrize('dtype', ['float16', 'float32'])
+def test_onnx_export_softmax(tmp_path, dtype):
+    x = mx.nd.random.uniform(0, 1, (2, 3, 4), dtype=dtype)

Review comment:
    Added more tests for temperature and axis.
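
    For readers following along, a sketch of how such parametrization could look in this file's style; `def_model` and `op_export_test` are the helpers already used above, while the parameter values and test name here are illustrative assumptions, not the PR's exact code:

    ```python
    # Assumes the imports and the def_model/op_export_test helpers defined in
    # tests/python-pytest/onnx/test_operators.py.
    @pytest.mark.parametrize('dtype', ['float16', 'float32'])
    @pytest.mark.parametrize('temperature', [0.1, 1.0, 5.0])
    @pytest.mark.parametrize('axis', [0, 1, -1])
    def test_onnx_export_softmax_extra(tmp_path, dtype, temperature, axis):
        x = mx.nd.random.uniform(0, 1, (2, 3, 4), dtype=dtype)
        M = def_model('softmax', temperature=temperature, axis=axis)
        op_export_test('softmax_extra', M, [x], tmp_path)
    ```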







[GitHub] [incubator-mxnet] Zha0q1 commented on a change in pull request #19691: [v1.x] ONNX fix softmax

Posted by GitBox <gi...@apache.org>.
Zha0q1 commented on a change in pull request #19691:
URL: https://github.com/apache/incubator-mxnet/pull/19691#discussion_r545525290



##########
File path: tests/python-pytest/onnx/test_operators.py
##########
@@ -253,3 +254,16 @@ def test_onnx_export_dropout(tmp_path, dtype, p):
     M = def_model('Dropout', p=p)
     x = mx.nd.array([[3,0.5,-0.5,2,7],[2,-0.4,7,3,0.2]], dtype=dtype)
     op_export_test('Dropout', M, [x], tmp_path)
+
+@pytest.mark.parametrize('dtype', ['float16', 'float32'])
+def test_onnx_export_softmax(tmp_path, dtype):
+    x = mx.nd.random.uniform(0, 1, (2, 3, 4), dtype=dtype)

Review comment:
    Sorry about nitpicking: shall we also test high-dimensional inputs with the softmax axis set to a middle dimension?
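
    A sketch of what that suggestion could look like, reusing the same helpers; the shape, dtype, and test name are illustrative assumptions rather than the PR's final code:

    ```python
    @pytest.mark.parametrize('dtype', ['float32'])
    def test_onnx_export_softmax_mid_axis(tmp_path, dtype):
        # 4-D input with softmax taken over a middle dimension
        x = mx.nd.random.uniform(0, 1, (2, 3, 4, 5), dtype=dtype)
        M = def_model('softmax', axis=2)
        op_export_test('softmax_mid_axis', M, [x], tmp_path)
    ```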







[GitHub] [incubator-mxnet] Zha0q1 commented on a change in pull request #19691: [WIP] ONNX fix softmax

Posted by GitBox <gi...@apache.org>.
Zha0q1 commented on a change in pull request #19691:
URL: https://github.com/apache/incubator-mxnet/pull/19691#discussion_r545430693



##########
File path: tests/python-pytest/onnx/test_operators.py
##########
@@ -205,3 +206,15 @@ def test_onnx_export_fully_connected(tmp_path, dtype, num_hidden, no_bias, flatt
     if not no_bias:
         args.append(mx.nd.random.uniform(0,1,(num_hidden,)))
     op_export_test('FullyConnected', M, args, tmp_path)
+
+@pytest.mark.parametrize('dtype', ['float32', 'float64'])
+def test_onnx_export_softmax(tmp_path, dtype):
+    x = mx.nd.random.uniform(0, 1, (2, 3, 4), dtype=dtype)
+    M1 = def_model('softmax')
+    op_export_test('softmax_1', M1, [x], tmp_path)
+    M2 = def_model('softmax', use_length=True, axis=0)
+    l2 = mx.nd.array([[2,0,3,1],[1,3,2,0], [0,0,0,1]], dtype=int)

Review comment:
       gotcha 







[GitHub] [incubator-mxnet] Zha0q1 commented on a change in pull request #19691: [v1.x] ONNX fix softmax

Posted by GitBox <gi...@apache.org>.
Zha0q1 commented on a change in pull request #19691:
URL: https://github.com/apache/incubator-mxnet/pull/19691#discussion_r545525529



##########
File path: tests/python-pytest/onnx/test_operators.py
##########
@@ -253,3 +254,16 @@ def test_onnx_export_dropout(tmp_path, dtype, p):
     M = def_model('Dropout', p=p)
     x = mx.nd.array([[3,0.5,-0.5,2,7],[2,-0.4,7,3,0.2]], dtype=dtype)
     op_export_test('Dropout', M, [x], tmp_path)
+
+@pytest.mark.parametrize('dtype', ['float16', 'float32'])
+def test_onnx_export_softmax(tmp_path, dtype):
+    x = mx.nd.random.uniform(0, 1, (2, 3, 4), dtype=dtype)

Review comment:
       We might also parametrize temperature?







[GitHub] [incubator-mxnet] Zha0q1 commented on pull request #19691: [v1.x] ONNX fix softmax

Posted by GitBox <gi...@apache.org>.
Zha0q1 commented on pull request #19691:
URL: https://github.com/apache/incubator-mxnet/pull/19691#issuecomment-748558599


   Thanks!





[GitHub] [incubator-mxnet] Zha0q1 commented on a change in pull request #19691: [WIP] ONNX fix softmax

Posted by GitBox <gi...@apache.org>.
Zha0q1 commented on a change in pull request #19691:
URL: https://github.com/apache/incubator-mxnet/pull/19691#discussion_r545405721



##########
File path: tests/python-pytest/onnx/test_operators.py
##########
@@ -205,3 +206,15 @@ def test_onnx_export_fully_connected(tmp_path, dtype, num_hidden, no_bias, flatt
     if not no_bias:
         args.append(mx.nd.random.uniform(0,1,(num_hidden,)))
     op_export_test('FullyConnected', M, args, tmp_path)
+
+@pytest.mark.parametrize('dtype', ['float32', 'float64'])
+def test_onnx_export_softmax(tmp_path, dtype):
+    x = mx.nd.random.uniform(0, 1, (2, 3, 4), dtype=dtype)
+    M1 = def_model('softmax')
+    op_export_test('softmax_1', M1, [x], tmp_path)
+    M2 = def_model('softmax', use_length=True, axis=0)
+    l2 = mx.nd.array([[2,0,3,1],[1,3,2,0], [0,0,0,1]], dtype=int)

Review comment:
       Shall we also use randomized inputs here?
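
    For reference, one way randomized lengths could be drawn for this case; this is a hedged sketch only, and as noted elsewhere in this thread the PR intentionally keeps handcrafted values to hit specific edge cases:

    ```python
    import mxnet as mx

    x = mx.nd.random.uniform(0, 1, (2, 3, 4))
    axis = 0
    # Random lengths in [0, x.shape[axis]]; the length array has x's shape with
    # the softmax axis removed.
    l = mx.nd.random.randint(0, x.shape[axis] + 1, shape=(3, 4))
    y = mx.nd.softmax(x, length=l, axis=axis, use_length=True)
    ```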







[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #19691: [v1.x] ONNX fix softmax

Posted by GitBox <gi...@apache.org>.
mxnet-bot commented on pull request #19691:
URL: https://github.com/apache/incubator-mxnet/pull/19691#issuecomment-748526247


   Jenkins CI successfully triggered: [unix-gpu]

