Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2020/07/16 23:23:31 UTC

[GitHub] [incubator-mxnet] leezu commented on a change in pull request #18690: [WIP] optimize graph in presence of dynamic shape ops

leezu commented on a change in pull request #18690:
URL: https://github.com/apache/incubator-mxnet/pull/18690#discussion_r456130118



##########
File path: python/mxnet/symbol/symbol.py
##########
@@ -1470,6 +1470,10 @@ def optimize_for(self, backend, args=None, aux=None, ctx=None,
         ctx : Context, optional
             Device context, used to infer stypes
 
+        is_np_sym : boolean, optional
+            Output symbol type
+            - If true, output type is np symbol, otherwise nd symbol.
+

Review comment:
       nd symbol will be removed soon anyway. Can you rely on the global `is_np` state instead (and, if needed, adapt that state inside the `_build_cache` function)?
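
       For reference, the global numpy-semantics state the reviewer points to can be queried and toggled through MXNet's public helpers. A minimal sketch (not part of the PR) of how downstream code could consult that state rather than accept an `is_np_sym` flag:

       ```python
       from mxnet import npx, util

       # Classic (nd) symbol semantics are the default, so the global flag is off.
       assert not util.is_np_array()

       # Flipping the global state tells downstream code to build np symbols,
       # removing the need to thread an is_np_sym boolean through the call chain.
       npx.set_np()
       assert util.is_np_array()

       # Restore classic nd semantics when done.
       npx.reset_np()
       assert not util.is_np_array()
       ```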

##########
File path: python/mxnet/symbol/symbol.py
##########
@@ -2627,6 +2633,24 @@ def detach(self):
     def backward(self):
         raise NotImplementedForSymbol(self.backward, None)
 
+    def optimize_for_dynamic_shape_op(self, is_np_sym=False):

Review comment:
       Why is this a public API even though it's called automatically in `_build_cache`? Should it be private?
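
       A minimal sketch of the private-helper arrangement being suggested, assuming the method is only ever needed from `_build_cache`; the surrounding code here is illustrative, not the PR's implementation:

       ```python
       class Symbol:
           # ... existing Symbol methods ...

           def _optimize_for_dynamic_shape_op(self):
               """Internal helper: rewrite the graph around dynamic-shape ops.

               Underscore-prefixed so it is not part of the public Symbol API;
               callers reach it only through _build_cache.
               """
               raise NotImplementedError  # placeholder body for this sketch


       def _build_cache(sym):
           # The optimization is applied here automatically, so users never
           # need to call the helper themselves.
           return sym._optimize_for_dynamic_shape_op()
       ```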




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org