Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2021/02/03 09:17:23 UTC

[GitHub] [tvm] euntaik opened a new pull request #7400: [FRONTEND][TFLITE] get input tensor information from graph

euntaik opened a new pull request #7400:
URL: https://github.com/apache/tvm/pull/7400


   get input tensor information from graph 


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [tvm] leandron commented on a change in pull request #7400: [FRONTEND][TFLITE] get input tensor information from graph

Posted by GitBox <gi...@apache.org>.
leandron commented on a change in pull request #7400:
URL: https://github.com/apache/tvm/pull/7400#discussion_r573808458



##########
File path: python/tvm/relay/frontend/tflite.py
##########
@@ -3539,7 +3539,45 @@ def get_tensor_name(subgraph, tensor_idx):
     return subgraph.Tensors(tensor_idx).Name().decode("utf-8")
 
 
-def from_tflite(model, shape_dict, dtype_dict):
+def _decode_type(n):
+    _tflite_m = {

Review comment:
       I see this is duplicated in tvmc/frontends.py - is there any reason why we can't reuse this one there?

##########
File path: python/tvm/driver/tvmc/frontends.py
##########
@@ -241,43 +241,10 @@ def load(self, path, shape_dict=None):
         if version != 3:
             raise TVMCException("input file not tflite version 3")
 
-        logger.debug("tflite_input_type")
-        input_shapes, dtype_dict = TFLiteFrontend._input_type(tflite_model)
-        if shape_dict is not None:
-            input_shapes.update(shape_dict)
-
         logger.debug("parse TFLite model and convert into Relay computation graph")
-        mod, params = relay.frontend.from_tflite(
-            tflite_model, shape_dict=input_shapes, dtype_dict=dtype_dict
-        )
+        mod, params = relay.frontend.from_tflite(tflite_model)

Review comment:
       Since we merged #7366, users are able to provide shapes to tvmc from the outside; can you have a look at that one and adjust?
   
   ```suggestion
           mod, params = relay.frontend.from_tflite(tflite_model, shape_dict=input_shapes, dtype_dict=dtype_dict)
   ```
   
   cc @CircleSpin @hogepodge  to help
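
   Put differently, the removed lines above implemented exactly the precedence being asked for here: graph-derived shapes as defaults, with any user-supplied shapes from #7366 taking priority. A condensed sketch of that merge pattern, using the same names as in the diff above:

   ```python
   # Graph-derived defaults first, then user-supplied shapes (e.g. tvmc's
   # shape arguments from #7366) override them.
   input_shapes, dtype_dict = TFLiteFrontend._input_type(tflite_model)
   if shape_dict is not None:
       input_shapes.update(shape_dict)

   mod, params = relay.frontend.from_tflite(
       tflite_model, shape_dict=input_shapes, dtype_dict=dtype_dict
   )
   ```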




[GitHub] [tvm] euntaik commented on a change in pull request #7400: [FRONTEND][TFLITE] get input tensor information from graph

Posted by GitBox <gi...@apache.org>.
euntaik commented on a change in pull request #7400:
URL: https://github.com/apache/tvm/pull/7400#discussion_r573533135



##########
File path: python/tvm/relay/frontend/tflite.py
##########
@@ -3588,8 +3643,14 @@ def from_tflite(model, shape_dict, dtype_dict):
     exp_tab = ExprTable()
     for model_input in model_inputs:
         model_input_name = get_tensor_name(subgraph, model_input)
-        shape = shape_dict[model_input_name] if model_input_name in shape_dict else None
-        dtype = dtype_dict[model_input_name] if model_input_name in dtype_dict else "float32"
+        if shape_dict:
+            shape = shape_dict[model_input_name] if model_input_name in shape_dict else None
+        else:
+            shape = get_tensor_shape(subgraph, model_input)
+        if dtype_dict:
+            dtype = dtype_dict[model_input_name] if model_input_name in dtype_dict else "float32"
+        else:
+            dtype = get_tensor_type(subgraph, model_input)

Review comment:
       
   
   > We have a similar function in TVMC that collects the same information being proposed here. I agree we should move what is there, to unify the functionality here.
   
   Oh, it was there all along. I think I missed your code since I was loading my models in a separate script and feeding the Relay output into my own compile passes.
   
   
   > Can you have a look at the function I'm pointing to here (below) and spot why they are so different,
   
   I don't see much difference except that your code accounts for models with more than one subgraph.
   
   > and, if you agree on what's the best approach, improve it here and remove it there?
   
   My rationale for putting this code in the tflite.py file was:
   1. Use the shape and type data already embedded in the graph (see the sketch after this list).
   2. Keep the code inside the frontend, since it depends on frontend-specific details.
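
   A minimal sketch of such graph-derived helpers, assuming the generated `tflite` flatbuffer bindings (the names and bodies here are illustrative rather than the exact code in this PR):

   ```python
   # Sketch only: read input shape and dtype directly from the TFLite flatbuffer.
   def get_tensor_shape(subgraph, tensor_idx):
       # Tensor.ShapeAsNumpy() returns the shape stored in the flatbuffer.
       return tuple(subgraph.Tensors(tensor_idx).ShapeAsNumpy())


   def get_tensor_type(subgraph, tensor_idx):
       # Tensor.Type() is a TFLite TensorType enum value; _decode_type
       # (shown in a later hunk) maps it to a dtype string.
       return _decode_type(subgraph.Tensors(tensor_idx).Type())
   ```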
   
   
   
   




[GitHub] [tvm] euntaik commented on a change in pull request #7400: [FRONTEND][TFLITE] get input tensor information from graph

Posted by GitBox <gi...@apache.org>.
euntaik commented on a change in pull request #7400:
URL: https://github.com/apache/tvm/pull/7400#discussion_r573843007



##########
File path: python/tvm/relay/frontend/tflite.py
##########
@@ -3539,7 +3539,45 @@ def get_tensor_name(subgraph, tensor_idx):
     return subgraph.Tensors(tensor_idx).Name().decode("utf-8")
 
 
-def from_tflite(model, shape_dict, dtype_dict):
+def _decode_type(n):
+    _tflite_m = {

Review comment:
       I missed it. Fixed.
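
   The `_tflite_m` dictionary is truncated in the diff above; presumably it maps TFLite `TensorType` enum values to dtype strings. A sketch under that assumption, with the numeric values taken from the TFLite schema's `TensorType` enum:

   ```python
   def _decode_type(n):
       # Map a TFLite TensorType enum value to a dtype string.
       _tflite_m = {
           0: "float32",
           1: "float16",
           2: "int32",
           3: "uint8",
           4: "int64",
           5: "string",
           6: "bool",
           7: "int16",
           8: "complex64",
           9: "int8",
       }
       return _tflite_m[n]
   ```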




[GitHub] [tvm] leandron commented on a change in pull request #7400: [FRONTEND][TFLITE] get input tensor information from graph

Posted by GitBox <gi...@apache.org>.
leandron commented on a change in pull request #7400:
URL: https://github.com/apache/tvm/pull/7400#discussion_r573275644



##########
File path: python/tvm/relay/frontend/tflite.py
##########
@@ -3539,7 +3539,62 @@ def get_tensor_name(subgraph, tensor_idx):
     return subgraph.Tensors(tensor_idx).Name().decode("utf-8")
 
 
-def from_tflite(model, shape_dict, dtype_dict):
+def get_tensor_shape(subgraph, tensor_idx):
+    """Get the tensor shape.

Review comment:
       * minor: the name of the argument doesn't match the actual argument name
   * the parameter and return types are not specified
   
   Please review all docstrings being introduced here for the items above.
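
   For example, a docstring covering both points (argument names that match, parameter and return types spelled out) in the numpydoc style used elsewhere in tflite.py could look like the sketch below; the class path given for the subgraph parameter is an assumption:

   ```python
   def get_tensor_shape(subgraph, tensor_idx):
       """Get the shape of a tensor referenced by its index in a subgraph.

       Parameters
       ----------
       subgraph : tflite.SubGraph.SubGraph
           TFLite subgraph that owns the tensor.
       tensor_idx : int
           Index of the tensor within the subgraph.

       Returns
       -------
       tuple
           Shape of the tensor.
       """
       return tuple(subgraph.Tensors(tensor_idx).ShapeAsNumpy())
   ```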

##########
File path: python/tvm/relay/frontend/tflite.py
##########
@@ -3588,8 +3643,14 @@ def from_tflite(model, shape_dict, dtype_dict):
     exp_tab = ExprTable()
     for model_input in model_inputs:
         model_input_name = get_tensor_name(subgraph, model_input)
-        shape = shape_dict[model_input_name] if model_input_name in shape_dict else None
-        dtype = dtype_dict[model_input_name] if model_input_name in dtype_dict else "float32"
+        if shape_dict:
+            shape = shape_dict[model_input_name] if model_input_name in shape_dict else None
+        else:
+            shape = get_tensor_shape(subgraph, model_input)
+        if dtype_dict:
+            dtype = dtype_dict[model_input_name] if model_input_name in dtype_dict else "float32"
+        else:
+            dtype = get_tensor_type(subgraph, model_input)

Review comment:
       We have a similar function in TVMC that collects the same information being proposed here. I agree we should move what is there, to unify the functionality here.
   
   Can you have a look at the function I'm pointing to here (below) and spot why they are so different, and, if you agree on what's the best approach, improve it here and remove it there?
   
   https://github.com/apache/tvm/blob/2999d03284c74f6840503ae3b880d3579a76f1af/python/tvm/driver/tvmc/frontends.py#L255-L278
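
   For reference, the linked TVMC helper walks every subgraph of the model; a rough sketch of that collection pattern (generated `tflite` bindings assumed, function name illustrative):

   ```python
   def _collect_input_info(model):
       # Gather input shapes and dtypes across all subgraphs of a tflite.Model.
       shape_dict, dtype_dict = {}, {}
       for i in range(model.SubgraphsLength()):
           subgraph = model.Subgraphs(i)
           for j in range(subgraph.InputsLength()):
               tensor = subgraph.Tensors(subgraph.Inputs(j))
               name = tensor.Name().decode("utf-8")
               shape_dict[name] = tuple(tensor.ShapeAsNumpy())
               dtype_dict[name] = _decode_type(tensor.Type())
       return shape_dict, dtype_dict
   ```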
   
   




[GitHub] [tvm] euntaik commented on a change in pull request #7400: [FRONTEND][TFLITE] get input tensor information from graph

Posted by GitBox <gi...@apache.org>.
euntaik commented on a change in pull request #7400:
URL: https://github.com/apache/tvm/pull/7400#discussion_r573867907



##########
File path: python/tvm/driver/tvmc/frontends.py
##########
@@ -241,43 +241,10 @@ def load(self, path, shape_dict=None):
         if version != 3:
             raise TVMCException("input file not tflite version 3")
 
-        logger.debug("tflite_input_type")
-        input_shapes, dtype_dict = TFLiteFrontend._input_type(tflite_model)
-        if shape_dict is not None:
-            input_shapes.update(shape_dict)
-
         logger.debug("parse TFLite model and convert into Relay computation graph")
-        mod, params = relay.frontend.from_tflite(
-            tflite_model, shape_dict=input_shapes, dtype_dict=dtype_dict
-        )
+        mod, params = relay.frontend.from_tflite(tflite_model)

Review comment:
       You are right. Sorry for that.




[GitHub] [tvm] leandron commented on a change in pull request #7400: [FRONTEND][TFLITE] get input tensor information from graph

Posted by GitBox <gi...@apache.org>.
leandron commented on a change in pull request #7400:
URL: https://github.com/apache/tvm/pull/7400#discussion_r573844512



##########
File path: python/tvm/driver/tvmc/frontends.py
##########
@@ -241,43 +241,10 @@ def load(self, path, shape_dict=None):
         if version != 3:
             raise TVMCException("input file not tflite version 3")
 
-        logger.debug("tflite_input_type")
-        input_shapes, dtype_dict = TFLiteFrontend._input_type(tflite_model)
-        if shape_dict is not None:
-            input_shapes.update(shape_dict)
-
         logger.debug("parse TFLite model and convert into Relay computation graph")
-        mod, params = relay.frontend.from_tflite(
-            tflite_model, shape_dict=input_shapes, dtype_dict=dtype_dict
-        )
+        mod, params = relay.frontend.from_tflite(tflite_model)

Review comment:
       Is `from_tflite()` now duplicated? (I just looked quickly, so I might be wrong.)




[GitHub] [tvm] leandron edited a comment on pull request #7400: [FRONTEND][TFLITE] get input tensor information from graph

Posted by GitBox <gi...@apache.org>.
leandron edited a comment on pull request #7400:
URL: https://github.com/apache/tvm/pull/7400#issuecomment-779085662


   @mbaret @FrozenGene can you have a look at this one, and merge if you think it is ok?


[GitHub] [tvm] leandron commented on a change in pull request #7400: [FRONTEND][TFLITE] get input tensor information from graph

Posted by GitBox <gi...@apache.org>.
leandron commented on a change in pull request #7400:
URL: https://github.com/apache/tvm/pull/7400#discussion_r573551305



##########
File path: python/tvm/relay/frontend/tflite.py
##########
@@ -3588,8 +3643,14 @@ def from_tflite(model, shape_dict, dtype_dict):
     exp_tab = ExprTable()
     for model_input in model_inputs:
         model_input_name = get_tensor_name(subgraph, model_input)
-        shape = shape_dict[model_input_name] if model_input_name in shape_dict else None
-        dtype = dtype_dict[model_input_name] if model_input_name in dtype_dict else "float32"
+        if shape_dict:
+            shape = shape_dict[model_input_name] if model_input_name in shape_dict else None
+        else:
+            shape = get_tensor_shape(subgraph, model_input)
+        if dtype_dict:
+            dtype = dtype_dict[model_input_name] if model_input_name in dtype_dict else "float32"
+        else:
+            dtype = get_tensor_type(subgraph, model_input)

Review comment:
       Cool. I think we both agree that it is better to have the functionality only in tflite.py and remove it from TVMC.
   
   So I suggest we keep the one that accounts for multiple subgraphs and move it from TVMC to the official frontend. If you agree, feel free to do it in this PR.
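
   With that in place, callers could rely on the graph-derived defaults; a usage sketch, assuming the updated `from_tflite` makes both dicts optional (the tensor name "input" is purely illustrative):

   ```python
   from tvm import relay

   # `tflite_model` is an already-parsed tflite.Model object.
   mod, params = relay.frontend.from_tflite(tflite_model)  # shapes/dtypes read from the graph

   # Explicit overrides still take precedence when provided.
   mod, params = relay.frontend.from_tflite(
       tflite_model,
       shape_dict={"input": (1, 224, 224, 3)},
       dtype_dict={"input": "uint8"},
   )
   ```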




[GitHub] [tvm] euntaik commented on a change in pull request #7400: [FRONTEND][TFLITE] get input tensor information from graph

Posted by GitBox <gi...@apache.org>.
euntaik commented on a change in pull request #7400:
URL: https://github.com/apache/tvm/pull/7400#discussion_r573843547



##########
File path: python/tvm/driver/tvmc/frontends.py
##########
@@ -241,43 +241,10 @@ def load(self, path, shape_dict=None):
         if version != 3:
             raise TVMCException("input file not tflite version 3")
 
-        logger.debug("tflite_input_type")
-        input_shapes, dtype_dict = TFLiteFrontend._input_type(tflite_model)
-        if shape_dict is not None:
-            input_shapes.update(shape_dict)
-
         logger.debug("parse TFLite model and convert into Relay computation graph")
-        mod, params = relay.frontend.from_tflite(
-            tflite_model, shape_dict=input_shapes, dtype_dict=dtype_dict
-        )
+        mod, params = relay.frontend.from_tflite(tflite_model)

Review comment:
       Fixed it.




[GitHub] [tvm] leandron commented on pull request #7400: [FRONTEND][TFLITE] get input tensor information from graph

Posted by GitBox <gi...@apache.org>.
leandron commented on pull request #7400:
URL: https://github.com/apache/tvm/pull/7400#issuecomment-779084245


   > Any other comments?
   
   Sorry I forgot to check this again after CI.


[GitHub] [tvm] euntaik commented on a change in pull request #7400: [FRONTEND][TFLITE] get input tensor information from graph

Posted by GitBox <gi...@apache.org>.
euntaik commented on a change in pull request #7400:
URL: https://github.com/apache/tvm/pull/7400#discussion_r573581898



##########
File path: python/tvm/relay/frontend/tflite.py
##########
@@ -3588,8 +3643,14 @@ def from_tflite(model, shape_dict, dtype_dict):
     exp_tab = ExprTable()
     for model_input in model_inputs:
         model_input_name = get_tensor_name(subgraph, model_input)
-        shape = shape_dict[model_input_name] if model_input_name in shape_dict else None
-        dtype = dtype_dict[model_input_name] if model_input_name in dtype_dict else "float32"
+        if shape_dict:
+            shape = shape_dict[model_input_name] if model_input_name in shape_dict else None
+        else:
+            shape = get_tensor_shape(subgraph, model_input)
+        if dtype_dict:
+            dtype = dtype_dict[model_input_name] if model_input_name in dtype_dict else "float32"
+        else:
+            dtype = get_tensor_type(subgraph, model_input)

Review comment:
       Thanks, I will update the PR.




[GitHub] [tvm] mbaret merged pull request #7400: [FRONTEND][TFLITE] get input tensor information from graph

Posted by GitBox <gi...@apache.org>.
mbaret merged pull request #7400:
URL: https://github.com/apache/tvm/pull/7400


   


[GitHub] [tvm] euntaik commented on pull request #7400: [FRONTEND][TFLITE] get input tensor information from graph

Posted by GitBox <gi...@apache.org>.
euntaik commented on pull request #7400:
URL: https://github.com/apache/tvm/pull/7400#issuecomment-778964465


   Any other comments?


[GitHub] [tvm] leandron commented on pull request #7400: [FRONTEND][TFLITE] get input tensor information from graph

Posted by GitBox <gi...@apache.org>.
leandron commented on pull request #7400:
URL: https://github.com/apache/tvm/pull/7400#issuecomment-779085662


   @FrozenGene can you have a look at this one, and merge if you think it is ok?

