Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2020/07/01 17:43:54 UTC

[GitHub] [incubator-mxnet] ma-hei removed a comment on pull request #18445: updating ubuntu_cpu base image to 20.04 to observe failing tests regarding Python 3.8

ma-hei removed a comment on pull request #18445:
URL: https://github.com/apache/incubator-mxnet/pull/18445#issuecomment-649099499


   @Roshrini I found an issue when updating ONNX from 1.5.0 to 1.7.0; it is reproducible with Python 3.6. The following code triggers it. Do you have any idea what's going on?
   ```python
   import numpy as np
   import mxnet as mx
   from onnx import TensorProto, helper, mapping
   from mxnet.contrib.onnx.onnx2mx.import_onnx import GraphProto
   from mxnet.contrib.onnx.mx2onnx.export_onnx import MXNetGraph

   # Build a minimal ONNX model containing a single LpPool node.
   inputshape = (2, 3, 20, 20)
   input_tensor = [helper.make_tensor_value_info("input1", TensorProto.FLOAT, shape=inputshape)]

   outputshape = (2, 3, 17, 16)
   output_tensor = [helper.make_tensor_value_info("output", TensorProto.FLOAT, shape=outputshape)]

   onnx_attrs = {'kernel_shape': (4, 5), 'pads': (0, 0), 'strides': (1, 1), 'p': 1}
   nodes = [helper.make_node("LpPool", ["input1"], ["output"], **onnx_attrs)]

   graph = helper.make_graph(nodes, "test_lppool1", input_tensor, output_tensor)
   onnxmodel = helper.make_model(graph)

   # Import the ONNX model into MXNet ...
   graph = GraphProto()
   sym, arg_params, aux_params = graph.from_onnx(onnxmodel.graph)

   metadata = graph.get_graph_metadata(onnxmodel.graph)
   input_data = metadata['input_tensor_data']
   input_shape = [data[1] for data in input_data]

   # ... then export it back to ONNX to verify the round trip.
   params = {}
   params.update(arg_params)
   params.update(aux_params)
   converter = MXNetGraph()

   graph_proto = converter.create_onnx_graph_proto(sym, params, in_shape=input_shape, in_type=mapping.NP_TYPE_TO_TENSOR_TYPE[np.dtype('float32')])
   ```
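   One thing worth noting about this repro (a hypothetical side check, not part of the original report): `helper.make_model` with no explicit `opset_imports` stamps the model with the installed package's latest opset, so upgrading onnx from 1.5.0 to 1.7.0 also changes which opset the checker validates against:
   ```python
   # Hypothetical check, not from the original report: show which opset
   # helper.make_model defaults to for the currently installed onnx package.
   import onnx
   from onnx import TensorProto, helper

   x = helper.make_tensor_value_info("x", TensorProto.FLOAT, (1,))
   y = helper.make_tensor_value_info("y", TensorProto.FLOAT, (1,))
   node = helper.make_node("Identity", ["x"], ["y"])
   model = helper.make_model(helper.make_graph([node], "g", [x], [y]))

   # The default opset tracks the installed onnx version,
   # e.g. onnx 1.7.0 reports opset 12, while 1.5.0 reported opset 10.
   print(onnx.__version__, model.opset_import[0].version)
   ```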
   The line that throws the error is:
   ```
   graph_proto = converter.create_onnx_graph_proto(sym, params, in_shape=input_shape, in_type=mapping.NP_TYPE_TO_TENSOR_TYPE[np.dtype('float32')])
   ```
   The error I'm seeing is:
   ```
     File "/opt/anaconda3/envs/p36/lib/python3.6/site-packages/onnx/checker.py", line 54, in checker
       proto.SerializeToString(), ctx)
   onnx.onnx_cpp2py_export.checker.ValidationError: Node (pad0) has input size 1 not in range [min=2, max=3].
   
   ==> Context: Bad node spec: input: "input1" output: "pad0" name: "pad0" op_type: "Pad" attribute { name: "mode" s: "constant" type: STRING } attribute { name: "pads" ints: 0 ints: 0 ints: 0 ints: 0 ints: 0 ints: 0 ints: 0 ints: 0 type: INTS } attribute { name: "value" f: 0 type: FLOAT }
   ```
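   The checker's complaint about `Pad` having "input size 1" looks consistent with the opset-11 change that moved the `pads` values from a node attribute to a second input tensor. A minimal sketch (hypothetical, not from the original report) that builds a `Pad` node the opset-11 way and passes the checker:
   ```python
   # Hypothetical sketch: in opset >= 11 the Pad operator takes its padding
   # as a second input rather than a "pads" attribute, which matches the
   # "input size 1 not in range [min=2, max=3]" message above.
   import numpy as np
   from onnx import TensorProto, checker, helper, numpy_helper

   inp = helper.make_tensor_value_info("x", TensorProto.FLOAT, (2, 3, 20, 20))
   out = helper.make_tensor_value_info("y", TensorProto.FLOAT, (2, 3, 20, 20))

   # pads supplied as an initializer input: one begin/end pair per axis.
   pads = numpy_helper.from_array(np.zeros(8, dtype=np.int64), name="pads")

   node = helper.make_node("Pad", ["x", "pads"], ["y"], mode="constant")
   graph = helper.make_graph([node], "pad_opset11", [inp], [out], initializer=[pads])
   model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 11)])

   checker.check_model(model)  # Pad now has the expected two inputs
   ```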


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org