Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2021/04/28 21:49:50 UTC

[GitHub] [incubator-mxnet] waytrue17 opened a new pull request #20226: [v1.x] ONNX export support for RNN and sum_axis

waytrue17 opened a new pull request #20226:
URL: https://github.com/apache/incubator-mxnet/pull/20226


   ## Description ##
   Add ONNX conversion logic for the RNN and sum_axis operators, and add unit tests for both.
   
   ## Checklist ##
   ### Essentials ###
   - [ ] PR's title starts with a category (e.g. [BUGFIX], [MODEL], [TUTORIAL], [FEATURE], [DOC], etc)
   - [ ] Changes are complete (i.e. I finished coding on this PR)
   - [ ] All changes have test coverage
   - [ ] Code is well-documented
   
   ### Changes ###
   - [ ] Feature1, tests, (and when applicable, API doc)
   - [ ] Feature2, tests, (and when applicable, API doc)
   
   ## Comments ##
   - If this change is a backward incompatible change, why must this change be made.
   - Interesting edge cases to note here
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-mxnet] waytrue17 commented on a change in pull request #20226: [v1.x] ONNX export support for RNN and sum_axis

Posted by GitBox <gi...@apache.org>.
waytrue17 commented on a change in pull request #20226:
URL: https://github.com/apache/incubator-mxnet/pull/20226#discussion_r623435679



##########
File path: python/mxnet/onnx/mx2onnx/_op_translations/_op_translations_opset12.py
##########
@@ -2168,6 +2168,12 @@ def convert_sum(node, **kwargs):
         )
     return [node]
 
+@mx_op.register("sum_axis")
+def convert_sum_axis(node, **kwargs):
+    """Map MXNet's sum_axis operator.
+       sum_axis is equivalent to sum in MXNet
+    """
+    return convert_sum(node, **kwargs)

Review comment:
       Registered it under convert_sum. Thanks for the good point!
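       For reference, a minimal sketch of what registering both names on one converter might look like, assuming mx_op.register returns the decorated function so the decorators can be stacked (mx_op is the registry object already imported in _op_translations_opset12.py; the body of convert_sum is elided here):

           @mx_op.register("sum_axis")
           @mx_op.register("sum")
           def convert_sum(node, **kwargs):
               """Map MXNet's sum and sum_axis operators (sum_axis is an alias of sum in MXNet)."""
               ...  # existing convert_sum body, unchanged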







[GitHub] [incubator-mxnet] waytrue17 commented on a change in pull request #20226: [v1.x] ONNX export support for RNN and sum_axis

Posted by GitBox <gi...@apache.org>.
waytrue17 commented on a change in pull request #20226:
URL: https://github.com/apache/incubator-mxnet/pull/20226#discussion_r622596256



##########
File path: tests/python-pytest/onnx/test_operators.py
##########
@@ -1260,8 +1262,11 @@ def test_onnx_export_RNN(tmp_path, mode, dtype, state_size, input_size, num_laye
     if mode == 'lstm':
         cell = mx.nd.random.uniform(-1, 1, [num_layers, batch_size, state_size], dtype=dtype)
         op_export_test('rnn', M, [x, param, state, cell], tmp_path)
+    elif mode == 'rnn_relu':
+        # set a large atol as relu can output large numbers
+        op_export_test('rnn', M, [x, param, state], tmp_path, atol=1e20)
     else:
-        op_export_test('rnn', M, [x, param, state], tmp_path)
+        op_export_test('rnn', M, [x, param, state], tmp_path, atol=1e-2)

Review comment:
       From my local runs, the diff is between 1e-3 and 1e-2.
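       Illustration only, not taken from the PR: an absolute-tolerance check of the kind op_export_test presumably performs. With observed diffs of 1e-3 to 1e-2, atol=1e-2 passes while a tighter tolerance would not.

           import numpy as np

           mx_out = np.array([0.50, 1.25, -0.75], dtype=np.float32)
           onnx_out = mx_out + np.array([1e-3, 5e-3, 9e-3], dtype=np.float32)

           # passes: every |onnx_out - mx_out| <= 1e-2
           np.testing.assert_allclose(onnx_out, mx_out, rtol=0, atol=1e-2)
           # np.testing.assert_allclose(onnx_out, mx_out, rtol=0, atol=1e-3)  # would raise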







[GitHub] [incubator-mxnet] Zha0q1 commented on a change in pull request #20226: [v1.x] ONNX export support for RNN and sum_axis

Posted by GitBox <gi...@apache.org>.
Zha0q1 commented on a change in pull request #20226:
URL: https://github.com/apache/incubator-mxnet/pull/20226#discussion_r622593037



##########
File path: tests/python-pytest/onnx/test_operators.py
##########
@@ -1260,8 +1262,11 @@ def test_onnx_export_RNN(tmp_path, mode, dtype, state_size, input_size, num_laye
     if mode == 'lstm':
         cell = mx.nd.random.uniform(-1, 1, [num_layers, batch_size, state_size], dtype=dtype)
         op_export_test('rnn', M, [x, param, state, cell], tmp_path)
+    elif mode == 'rnn_relu':
+        # set a large atol as relu can output large numbers
+        op_export_test('rnn', M, [x, param, state], tmp_path, atol=1e20)
     else:
-        op_export_test('rnn', M, [x, param, state], tmp_path)
+        op_export_test('rnn', M, [x, param, state], tmp_path, atol=1e-2)

Review comment:
       Do we know how large the difference was?

##########
File path: tests/python-pytest/onnx/test_operators.py
##########
@@ -1234,21 +1235,22 @@ def test_onnx_export_sequence_reverse(tmp_path, dtype, params):
 
 
 # onnx LSTM from opset 11 does not support float64
-@pytest.mark.parametrize('mode', ['lstm', 'gru'])
+@pytest.mark.parametrize('mode', ['lstm', 'gru', 'rnn_tanh', 'rnn_relu'])
 @pytest.mark.parametrize('dtype', ['float32'])
-@pytest.mark.parametrize('state_size', [16, 32])
+@pytest.mark.parametrize('state_size', [16, 32, 64])
 @pytest.mark.parametrize('input_size', [16, 32, 64])
 @pytest.mark.parametrize('num_layers', [1, 2])
 @pytest.mark.parametrize('batch_size', [1, 2, 4])
-@pytest.mark.parametrize('seq_length', [16, 32])

Review comment:
       Why was 32 removed?







[GitHub] [incubator-mxnet] josephevans commented on a change in pull request #20226: [v1.x] ONNX export support for RNN and sum_axis

Posted by GitBox <gi...@apache.org>.
josephevans commented on a change in pull request #20226:
URL: https://github.com/apache/incubator-mxnet/pull/20226#discussion_r623386267



##########
File path: python/mxnet/onnx/mx2onnx/_op_translations/_op_translations_opset12.py
##########
@@ -2168,6 +2168,12 @@ def convert_sum(node, **kwargs):
         )
     return [node]
 
+@mx_op.register("sum_axis")
+def convert_sum_axis(node, **kwargs):
+    """Map MXNet's sum_axis operator.
+       sum_axis is equivalent to sum in MXNet
+    """
+    return convert_sum(node, **kwargs)

Review comment:
       Instead of wrapping convert_sum() here, couldn't we just add the decorator to register "sum_axis" on convert_sum()?







[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #20226: [v1.x] ONNX export support for RNN and sum_axis

Posted by GitBox <gi...@apache.org>.
mxnet-bot commented on pull request #20226:
URL: https://github.com/apache/incubator-mxnet/pull/20226#issuecomment-828802955


   Hey @waytrue17, thanks for submitting the PR.
   All tests are already queued to run once. If tests fail, you can trigger one or more tests again with the following commands: 
   - To trigger all jobs: @mxnet-bot run ci [all] 
   - To trigger specific jobs: @mxnet-bot run ci [job1, job2] 
   *** 
   **CI supported jobs**: [edge, unix-gpu, centos-cpu, unix-cpu, clang, centos-gpu, windows-gpu, windows-cpu, miscellaneous, sanity, website]
   *** 
   _Note_: 
    Only the following 3 categories can trigger CI: PR Author, MXNet Committer, Jenkins Admin. 
   All CI tests must pass before the PR can be merged. 
   





[GitHub] [incubator-mxnet] waytrue17 commented on a change in pull request #20226: [v1.x] ONNX export support for RNN and sum_axis

Posted by GitBox <gi...@apache.org>.
waytrue17 commented on a change in pull request #20226:
URL: https://github.com/apache/incubator-mxnet/pull/20226#discussion_r622595738



##########
File path: tests/python-pytest/onnx/test_operators.py
##########
@@ -1234,21 +1235,22 @@ def test_onnx_export_sequence_reverse(tmp_path, dtype, params):
 
 
 # onnx LSTM from opset 11 does not support float64
-@pytest.mark.parametrize('mode', ['lstm', 'gru'])
+@pytest.mark.parametrize('mode', ['lstm', 'gru', 'rnn_tanh', 'rnn_relu'])
 @pytest.mark.parametrize('dtype', ['float32'])
-@pytest.mark.parametrize('state_size', [16, 32])
+@pytest.mark.parametrize('state_size', [16, 32, 64])
 @pytest.mark.parametrize('input_size', [16, 32, 64])
 @pytest.mark.parametrize('num_layers', [1, 2])
 @pytest.mark.parametrize('batch_size', [1, 2, 4])
-@pytest.mark.parametrize('seq_length', [16, 32])

Review comment:
       Saw the same assertion issue with large seq_length/state_size/input_size. Will need to further decrease one of them to pass `rnn`, just like what we did before for `lstm` and `gru`.
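       A hypothetical shape of the trimmed parametrization (the exact values kept in the merged test may differ): cap seq_length so that the widest state/input cases do not accumulate enough float32 round-off between the MXNet and exported-ONNX outputs to exceed the test's atol.

           import pytest

           @pytest.mark.parametrize('state_size', [16, 32, 64])
           @pytest.mark.parametrize('input_size', [16, 32, 64])
           @pytest.mark.parametrize('seq_length', [16])  # hypothetical: 32 dropped
           def test_onnx_export_RNN_sketch(tmp_path, state_size, input_size, seq_length):
               ...  # would build the RNN block and call op_export_test as in the real test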







[GitHub] [incubator-mxnet] Zha0q1 merged pull request #20226: [v1.x] ONNX export support for RNN and sum_axis

Posted by GitBox <gi...@apache.org>.
Zha0q1 merged pull request #20226:
URL: https://github.com/apache/incubator-mxnet/pull/20226


   

