Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2021/03/30 17:35:55 UTC

[GitHub] [incubator-mxnet] barry-jin opened a new pull request #20105: [FFI] part4: npx.embedding, npx.topk, npx.layer_norm, npx.leaky_relu

barry-jin opened a new pull request #20105:
URL: https://github.com/apache/incubator-mxnet/pull/20105


   ## Description ##
   part4 of #20096
   Benchmarks
   npx.embedding
   ```python
   import timeit

   setup = """
   from mxnet import np, npx
   npx.set_np()
   inp = np.arange(0, 4, dtype=np.int32).reshape(2, 2)
   vec = np.ones((4, 32))
   """
   stmt = """
   out = npx.embedding(inp, vec, 4, 32)
   """
   timer = timeit.Timer(setup=setup,
                        stmt=stmt)
   num_repeat = 1000
   print(min(timer.repeat(repeat=10, number=num_repeat)) / num_repeat)
   ```
   legacy: 0.00011503259599999982
   New FFI: 7.77029469999997e-05
   
   npx.topk
   ```python
   import timeit

   setup = """
   from mxnet import np, npx
   npx.set_np()
   inp = np.ones((2, 10))
   """
   stmt = """
   out = npx.topk(inp, k=2, axis=None, ret_typ="indices", is_ascend=False, dtype="float32")
   """
   timer = timeit.Timer(setup=setup,
                        stmt=stmt)
   num_repeat = 1000
   print(min(timer.repeat(repeat=10, number=num_repeat)) / num_repeat)
   ```
   legacy: 0.0001376656030000003
   New FFI: 7.502453999999981e-05
   
   npx.layer_norm
   ```python
   import timeit

   setup = """
   from mxnet import np, npx
   npx.set_np()
   inp = np.ones((2, 10))
   gamma = np.ones((2,))
   beta = np.zeros((2,))
   """
   stmt = """
   out = npx.layer_norm(inp, gamma=gamma, beta=beta, axis=0, eps=1e-5, output_mean_var=False)
   """
   timer = timeit.Timer(setup=setup,
                        stmt=stmt)
   num_repeat = 1000
   print(min(timer.repeat(repeat=10, number=num_repeat)) / num_repeat)
   ```
   legacy: 0.00012714471100000013
   New FFI: 6.810771400000038e-05
   
   npx.leaky_relu
   ```python
   import timeit

   setup = """
   from mxnet import np, npx
   npx.set_np()
   inp = -1 * np.ones(shape=(2, 10))
   """
   stmt = """
   out = npx.leaky_relu(inp, act_type="leaky", slope=0.3, lower_bound=0.125, upper_bound=0.334)
   """
   timer = timeit.Timer(setup=setup,
                        stmt=stmt)
   num_repeat = 1000
   print(min(timer.repeat(repeat=10, number=num_repeat)) / num_repeat)
   ```
   legacy: 0.00011906070800000013
   New FFI: 5.249297100000039e-05
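
   Taken together, the reported timings imply the new FFI path is roughly 1.5x to 2.3x faster than the legacy path. A quick sketch (using only the numbers reported above) to derive the ratios:

```python
# Speedup = legacy time / new-FFI time, from the benchmark numbers above.
timings = {
    "npx.embedding":  (0.00011503259599999982, 7.77029469999997e-05),
    "npx.topk":       (0.0001376656030000003, 7.502453999999981e-05),
    "npx.layer_norm": (0.00012714471100000013, 6.810771400000038e-05),
    "npx.leaky_relu": (0.00011906070800000013, 5.249297100000039e-05),
}
speedups = {name: legacy / ffi for name, (legacy, ffi) in timings.items()}
for name, ratio in speedups.items():
    print(f"{name}: {ratio:.2f}x")
```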
   ## Checklist ##
   ### Essentials ###
   - [x] PR's title starts with a category (e.g. [BUGFIX], [MODEL], [TUTORIAL], [FEATURE], [DOC], etc)
   - [x] Changes are complete (i.e. I finished coding on this PR)
   - [x] All changes have test coverage
   - [x] Code is well-documented
   
   ### Changes ###
   - [ ] Feature1, tests, (and when applicable, API doc)
   - [ ] Feature2, tests, (and when applicable, API doc)
   
   ## Comments ##
   - If this change is a backward incompatible change, why must this change be made.
   - Interesting edge cases to note here
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-mxnet] szha commented on a change in pull request #20105: [FFI] part4: npx.embedding, npx.topk, npx.layer_norm, npx.leaky_relu

Posted by GitBox <gi...@apache.org>.
szha commented on a change in pull request #20105:
URL: https://github.com/apache/incubator-mxnet/pull/20105#discussion_r610247827



##########
File path: tests/python/unittest/test_numpy_gluon.py
##########
@@ -602,3 +602,57 @@ def test_pixelshuffle3d():
             [64, 88, 65, 89, 66, 90, 67, 91],
             [68, 92, 69, 93, 70, 94, 71, 95]]]]]
     )
+
+@use_np
+def test_embedding():
+    def check_embedding():
+        layer = gluon.nn.Embedding(10, 100)
+        layer.initialize()
+        x = mx.np.array([3,4,2,0,1])
+        with mx.autograd.record():
+            y = layer(x)
+            y.backward()
+        assert (layer.weight.grad().asnumpy()[:5] == 1).all()
+        assert (layer.weight.grad().asnumpy()[5:] == 0).all()
+
+    def check_embedding_large_input():
+        embedding = mx.gluon.nn.Embedding(10, 1)
+        embedding.initialize()
+        embedding.hybridize()
+        shape = (20481,)
+        with mx.autograd.record():
+            emb_in = embedding(mx.np.ones(shape))
+            loss = emb_in.sum()
+        loss.backward()
+        assert embedding.weight.grad().sum().item() == 20481
+
+    check_embedding()
+    check_embedding_large_input()
+
+@use_np
+def test_layernorm():
+    layer = nn.LayerNorm(in_channels=10)
+    check_layer_forward(layer, (2, 10, 10, 10))
+
+def check_layer_forward(layer, dshape):

Review comment:
       you can use `@pytest.mark.parametrize('arg1,arg2', [(arg1_val, arg2_val)])` to parameterize the test instead of having a wrapper test function.
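
For reference, a minimal self-contained illustration of the suggested pattern (the test name and parameters here are hypothetical, not taken from the PR):

```python
import pytest

# Each tuple in the list becomes one independent test case, replacing a
# hand-written wrapper function that loops over inputs.
@pytest.mark.parametrize('shape,expected_size', [
    ((2, 10), 20),
    ((2, 10, 10, 10), 2000),
])
def test_shape_size(shape, expected_size):
    size = 1
    for dim in shape:
        size *= dim
    assert size == expected_size
```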







[GitHub] [incubator-mxnet] szha commented on a change in pull request #20105: [FFI] part4: npx.embedding, npx.topk, npx.layer_norm, npx.leaky_relu

Posted by GitBox <gi...@apache.org>.
szha commented on a change in pull request #20105:
URL: https://github.com/apache/incubator-mxnet/pull/20105#discussion_r610245192



##########
File path: python/mxnet/ndarray/numpy_extension/_op.py
##########
@@ -354,9 +355,12 @@ def batch_norm(x, gamma, beta, running_mean, running_var, eps=1e-3, momentum=0.9
     out : NDArray or list of NDArrays
         The output of this function.
     """
-    return _api_internal.batch_norm(x, gamma, beta, running_mean, running_var, eps, momentum,
-                                    fix_gamma, use_global_stats, output_mean_var, axis,
-                                    cudnn_off, min_calib_range, max_calib_range)
+    out = _api_internal.batch_norm(x, gamma, beta, running_mean, running_var, eps, momentum,
+                                   fix_gamma, use_global_stats, output_mean_var, axis,
+                                   cudnn_off, min_calib_range, max_calib_range)
+    if isinstance(out, NDArrayBase):
+        return out
+    return list(out)

Review comment:
       what does this branch handle?







[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #20105: [FFI] part4: npx.embedding, npx.topk, npx.layer_norm, npx.leaky_relu

Posted by GitBox <gi...@apache.org>.
mxnet-bot commented on pull request #20105:
URL: https://github.com/apache/incubator-mxnet/pull/20105#issuecomment-810447378


   Hey @barry-jin, thanks for submitting the PR.
   All tests are already queued to run once. If tests fail, you can trigger one or more tests again with the following commands:
   - To trigger all jobs: @mxnet-bot run ci [all]
   - To trigger specific jobs: @mxnet-bot run ci [job1, job2]
   ***
   **CI supported jobs**: [windows-gpu, unix-gpu, clang, edge, windows-cpu, centos-cpu, website, centos-gpu, unix-cpu, miscellaneous, sanity]
   ***
   _Note_:
   Only the following 3 categories can trigger CI: PR Author, MXNet Committer, Jenkins Admin.
   All CI tests must pass before the PR can be merged.
   





[GitHub] [incubator-mxnet] barry-jin commented on pull request #20105: [FFI] part4: npx.embedding, npx.topk, npx.layer_norm, npx.leaky_relu

Posted by GitBox <gi...@apache.org>.
barry-jin commented on pull request #20105:
URL: https://github.com/apache/incubator-mxnet/pull/20105#issuecomment-811374670


   @mxnet-bot run ci [centos-cpu, unix-cpu, unix-gpu, windows-cpu, windows-gpu]





[GitHub] [incubator-mxnet] szha merged pull request #20105: [FFI] part4: npx.embedding, npx.topk, npx.layer_norm, npx.leaky_relu

Posted by GitBox <gi...@apache.org>.
szha merged pull request #20105:
URL: https://github.com/apache/incubator-mxnet/pull/20105


   





[GitHub] [incubator-mxnet] szha commented on a change in pull request #20105: [FFI] part4: npx.embedding, npx.topk, npx.layer_norm, npx.leaky_relu

Posted by GitBox <gi...@apache.org>.
szha commented on a change in pull request #20105:
URL: https://github.com/apache/incubator-mxnet/pull/20105#discussion_r610245572



##########
File path: python/mxnet/ndarray/numpy_extension/_op.py
##########
@@ -354,9 +355,12 @@ def batch_norm(x, gamma, beta, running_mean, running_var, eps=1e-3, momentum=0.9
     out : NDArray or list of NDArrays
         The output of this function.
     """
-    return _api_internal.batch_norm(x, gamma, beta, running_mean, running_var, eps, momentum,
-                                    fix_gamma, use_global_stats, output_mean_var, axis,
-                                    cudnn_off, min_calib_range, max_calib_range)
+    out = _api_internal.batch_norm(x, gamma, beta, running_mean, running_var, eps, momentum,
+                                   fix_gamma, use_global_stats, output_mean_var, axis,
+                                   cudnn_off, min_calib_range, max_calib_range)
+    if isinstance(out, NDArrayBase):
+        return out
+    return list(out)

Review comment:
       this seems to be a common pattern







[GitHub] [incubator-mxnet] barry-jin commented on pull request #20105: [FFI] part4: npx.embedding, npx.topk, npx.layer_norm, npx.leaky_relu

Posted by GitBox <gi...@apache.org>.
barry-jin commented on pull request #20105:
URL: https://github.com/apache/incubator-mxnet/pull/20105#issuecomment-819957584


   @mxnet-bot run ci [unix-cpu]





[GitHub] [incubator-mxnet] barry-jin commented on a change in pull request #20105: [FFI] part4: npx.embedding, npx.topk, npx.layer_norm, npx.leaky_relu

Posted by GitBox <gi...@apache.org>.
barry-jin commented on a change in pull request #20105:
URL: https://github.com/apache/incubator-mxnet/pull/20105#discussion_r610274484



##########
File path: python/mxnet/ndarray/numpy_extension/_op.py
##########
@@ -354,9 +355,12 @@ def batch_norm(x, gamma, beta, running_mean, running_var, eps=1e-3, momentum=0.9
     out : NDArray or list of NDArrays
         The output of this function.
     """
-    return _api_internal.batch_norm(x, gamma, beta, running_mean, running_var, eps, momentum,
-                                    fix_gamma, use_global_stats, output_mean_var, axis,
-                                    cudnn_off, min_calib_range, max_calib_range)
+    out = _api_internal.batch_norm(x, gamma, beta, running_mean, running_var, eps, momentum,
+                                   fix_gamma, use_global_stats, output_mean_var, axis,
+                                   cudnn_off, min_calib_range, max_calib_range)
+    if isinstance(out, NDArrayBase):
+        return out
+    return list(out)

Review comment:
       Because `out` can be either an `NDArrayBase` or an ADT type. This branch returns `out` directly if it is an NDArray; otherwise it converts the ADT to a Python list and returns a list of NDArrays.
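
A plain-Python sketch of that branch (the class below is a stand-in for illustration, not MXNet's real `NDArrayBase`):

```python
# Stand-in for mxnet's NDArrayBase, used only to illustrate the branch.
class NDArrayBase:
    pass

def normalize_out(out):
    """Mimic the wrapper's return logic: a single NDArrayBase passes
    through unchanged, while an ADT-like sequence becomes a list."""
    if isinstance(out, NDArrayBase):
        return out
    return list(out)

single = NDArrayBase()
assert normalize_out(single) is single              # single output: unchanged
assert len(normalize_out((NDArrayBase(), NDArrayBase()))) == 2  # ADT -> list
```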







[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #20105: [FFI] part4: npx.embedding, npx.topk, npx.layer_norm, npx.leaky_relu

Posted by GitBox <gi...@apache.org>.
mxnet-bot commented on pull request #20105:
URL: https://github.com/apache/incubator-mxnet/pull/20105#issuecomment-811374764


   Jenkins CI successfully triggered : [unix-gpu, windows-cpu, unix-cpu, windows-gpu, centos-cpu]





[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #20105: [FFI] part4: npx.embedding, npx.topk, npx.layer_norm, npx.leaky_relu

Posted by GitBox <gi...@apache.org>.
mxnet-bot commented on pull request #20105:
URL: https://github.com/apache/incubator-mxnet/pull/20105#issuecomment-819957610


   Jenkins CI successfully triggered : [unix-cpu]

