Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/07/01 23:43:31 UTC

[GitHub] [incubator-mxnet] sandeep-krishnamurthy commented on a change in pull request #15403: Updating profiler tutorial to include new custom operator profiling

sandeep-krishnamurthy commented on a change in pull request #15403: Updating profiler tutorial to include new custom operator profiling
URL: https://github.com/apache/incubator-mxnet/pull/15403#discussion_r299255615
 
 

 ##########
 File path: docs/tutorials/python/profiler.md
 ##########
 @@ -206,6 +206,63 @@ Let's zoom in to check the time taken by operators
 
 The above picture visualizes the sequence in which the operators were executed and the time taken by each operator.
 
+### Profiling Custom Operators
+If the existing NDArray operators do not cover everything your model needs, MXNet supports [Custom Operators](https://mxnet.incubator.apache.org/versions/master/tutorials/gluon/customop.html) that you can define in Python. The code inside a custom operator's `forward()` and `backward()` falls into two kinds: "pure Python" code (NumPy operators included) and "sub-operators" (NDArray operators called within `forward()` and `backward()`). MXNet can profile the execution time of both kinds without any additional setup: the profiler breaks a single custom operator call into a pure Python event and, if there are any, several sub-operator events. All of these events share a prefix in their names, which is, conveniently, the name of the custom operator you called.
+
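+To make this distinction concrete, here is a minimal sketch of a `forward()` that mixes pure Python work with an NDArray sub-operator. The operator and names below are illustrative only and are not part of the example that follows; registration through a `CustomOpProp`, shown in the full example below, is omitted for brevity:
+
+```python
+import numpy as np
+import mxnet as mx
+
+class MyMaskedOp(mx.operator.CustomOp):
+    def forward(self, is_train, req, in_data, out_data, aux):
+        # "Pure Python" code: this NumPy call runs inside the operator's Python
+        # body and is profiled as a single Python event.
+        mask = np.ones(in_data[0].shape, dtype='float32')
+        # "Sub-operator": this NDArray multiplication is dispatched to the MXNet
+        # engine and is profiled as its own sub-operator event.
+        out = in_data[0] * mx.nd.array(mask, ctx=in_data[0].context)
+        self.assign(out_data[0], req[0], out)
+
+    def backward(self, req, out_grad, in_data, out_data, in_grad, aux):
+        # Pass the gradient straight through to the input.
+        self.assign(in_grad[0], req[0], out_grad[0])
+```
+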
+Let's try profiling custom operators with the following code example:
+
+```python
+
+import mxnet as mx
+from mxnet import nd
+from mxnet import profiler
+
+class MyAddOne(mx.operator.CustomOp):
+    def forward(self, is_train, req, in_data, out_data, aux):  
+        self.assign(out_data[0], req[0], in_data[0]+1)
+
+    def backward(self, req, out_grad, in_data, out_data, in_grad, aux):
+        self.assign(in_grad[0], req[0], out_grad[0])
+
+@mx.operator.register('MyAddOne')
+class CustomAddOneProp(mx.operator.CustomOpProp):
+    def __init__(self):
+        super(CustomAddOneProp, self).__init__(need_top_grad=True)
+
+    def list_arguments(self):
+        return ['data']
+
+    def list_outputs(self):
+        return ['output']
+
+    def infer_shape(self, in_shape):
+        return [in_shape[0]], [in_shape[0]], []
+
+    def create_operator(self, ctx, shapes, dtypes):
+        return MyAddOne()
+
+
+inp = mx.nd.zeros(shape=(500, 500))
+
+profiler.set_config(profile_all=True, continuous_dump=True)
+profiler.set_state('run')
+
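+# Calling the custom operator; as explained above, this is profiled as a pure
+# Python event plus sub-operator events, all prefixed with 'MyAddOne'.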
+w = nd.Custom(inp, op_type="MyAddOne")
+
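+# Block until all asynchronous operations have completed so that their events
+# are recorded before the profiler is stopped.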
+mx.nd.waitall()
+
+profiler.set_state('stop')
+profiler.dump()
+```
+
+Here, we have created a custom operator called `MyAddOne`, and within its `foward()` function, we simply add one to the input. We can visualize the dump file in `chrome://tracing/`:
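+
+Besides the trace viewer, the profiler can also report a plain-text summary of aggregate statistics through `profiler.dumps()`. Here is a minimal sketch; note that it assumes `aggregate_stats=True` was additionally passed to `profiler.set_config()`, and the exact output format may differ between MXNet versions:
+
+```python
+# Print the aggregate statistics collected above. The custom operator's events
+# should show up under names prefixed with 'MyAddOne'.
+# Assumes profiler.set_config(..., aggregate_stats=True) was used above.
+print(profiler.dumps())
+```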
 
 Review comment:
    nit: spell check. 'foward()' -> 'forward()'

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services