Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/12/09 00:43:48 UTC

[GitHub] [incubator-mxnet] kice commented on issue #16590: import_onnx.py parser for onnx opset >= 9 has bug

URL: https://github.com/apache/incubator-mxnet/issues/16590#issuecomment-563018787
 
 
   In my case, I used PyTorch to export the ONNX model with an opset lower than 8, and the model still cannot be imported into MXNet.
   
   ```
   >>> mxnet.__version__
   '1.6.0'
   >>> torch.__version__
   '1.3.1'
   >>> onnx.__version__
   '1.6.0'
   ```
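   
   The import on the MXNet side is just the documented call (a minimal sketch; `model_file` is the path to the exported .onnx file):
   ```
   from mxnet.contrib import onnx as onnx_mxnet

   # This runs the onnx2mx parser whose traceback is shown further down
   sym, arg_params, aux_params = onnx_mxnet.import_model(model_file)
   ```
   To see which names the graph nodes actually reference, I dumped the inputs of every node: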
   
   ```
   import onnx

   # Load the exported model and print the input names referenced by each graph node
   onnx_model = onnx.load(model_file)
   for node in onnx_model.graph.node:
       for i in node.input:
           print(i)
   ```
   Here is the output
   ```
   data
   head.0.weight
   head.0.bias
   19
   body.0.body.0.weight
   body.0.body.0.bias
   20
   21
   body.0.body.2.weight
   body.0.body.2.bias
   22
   19
   23
   body.1.body.0.weight
   body.1.body.0.bias
   24
   25
   body.1.body.2.weight
   body.1.body.2.bias
   26
   23
   27
   body.2.weight
   body.2.bias
   28
   19
   29
   tail.0.0.weight
   tail.0.0.bias
   31
   30
   43
   32
   34
   33
   44
   35
   tail.0.2.weight
   tail.0.2.bias
   37
   36
   45
   38
   40
   39
   46
   41
   tail.1.weight
   tail.1.bias
   ```
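   
   Note that parameter names such as `head.0.weight` appear as node inputs. A quick check of where those names are actually declared (a sketch reusing the `onnx_model` loaded above; as far as I can tell, newer exporters may register the weights only as initializers rather than as graph inputs):
   ```
   graph_inputs = {i.name for i in onnx_model.graph.input}
   initializers = {t.name for t in onnx_model.graph.initializer}

   # The KeyError in the traceback below suggests that names which are only
   # initializers never make it into the parser's self._nodes lookup table
   print('head.0.weight' in graph_inputs)   # declared as a graph input?
   print('head.0.weight' in initializers)   # declared as an initializer?
   ```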
   
   Here is the error I get when importing the model:
   ```
   c:\program files\python37\lib\site-packages\mxnet\contrib\onnx\onnx2mx\import_model.py in import_model(model_file)
        57     # loads model file and returns ONNX protobuf object
        58     model_proto = onnx.load_model(model_file)
   ---> 59     sym, arg_params, aux_params = graph.from_onnx(model_proto.graph)
        60     return sym, arg_params, aux_params
        61 
   
   c:\program files\python37\lib\site-packages\mxnet\contrib\onnx\onnx2mx\import_onnx.py in from_onnx(self, graph)
       113             node_name = node_name if node_name else None
       114             onnx_attr = self._parse_attr(node.attribute)
   --> 115             inputs = [self._nodes[i] for i in node.input]
       116             mxnet_sym = self._convert_operator(node_name, op_name, onnx_attr, inputs)
       117 
   
   c:\program files\python37\lib\site-packages\mxnet\contrib\onnx\onnx2mx\import_onnx.py in <listcomp>(.0)
       113             node_name = node_name if node_name else None
       114             onnx_attr = self._parse_attr(node.attribute)
   --> 115             inputs = [self._nodes[i] for i in node.input]
       116             mxnet_sym = self._convert_operator(node_name, op_name, onnx_attr, inputs)
       117 
   
   KeyError: 'head.0.weight'
   ```
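   
   So the lookup `self._nodes[i]` at line 115 of import_onnx.py is what fails for the weight names. One workaround I have seen suggested for this kind of mismatch (untested on this model) is to patch the exported file so every initializer is also listed as a graph input, saving the result to a new file, here called `patched.onnx`:
   ```
   from onnx import helper

   existing_inputs = {i.name for i in onnx_model.graph.input}
   for init in onnx_model.graph.initializer:
       if init.name not in existing_inputs:
           # re-declare the weight tensor as a graph input with the same name, dtype and shape
           vi = helper.make_tensor_value_info(init.name, init.data_type, init.dims)
           onnx_model.graph.input.extend([vi])
   onnx.save(onnx_model, 'patched.onnx')
   ```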
   
   I even tried adding the model parameter names to the input names for the PyTorch export:
   ```
   input_names = ['data'] + list(model.state_dict().keys())
   torch.onnx.export(model,                     # model being run
       x,                         # model input (or a tuple for multiple inputs)
       onnx_name,                 # where to save the model (can be a file or file-like object)
       export_params=True,        # store the trained parameter weights inside the model file
       opset_version=opset,       # the ONNX version to export the model to
       do_constant_folding=True,  # whether to execute constant folding for optimization
       input_names = input_names, # the model's input names
       output_names = ['output']  # the model's output names
   )
   ```
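   
   For what it's worth, newer versions of torch.onnx.export also accept a keep_initializers_as_inputs argument; I have not checked whether 1.3.1 supports it, but if it does, something like this might keep the weights listed as graph inputs (untested sketch):
   ```
   torch.onnx.export(model, x, onnx_name,
       export_params=True,
       opset_version=opset,
       do_constant_folding=True,
       input_names=['data'],
       output_names=['output'],
       keep_initializers_as_inputs=True)  # untested: keep weights declared as graph inputs
   ```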
