Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/10/18 04:52:47 UTC
[GitHub] [incubator-mxnet] zczjx opened a new issue #16528: mxnet cannot export the onnx model with BatchNormalization operator
URL: https://github.com/apache/incubator-mxnet/issues/16528
## Description
onnx_mxnet.export_model() throws an exception and fails when I use it to export my trained model, which contains a BatchNormalization operator.
## Environment info (Required)
os: Ubuntu 16 / Anaconda python3.7
mxnet 1.5.1
onnx 1.5.0
Package used (Python/R/Scala/Julia):
Python
## Error Message:
12:41:27] src/operator/nn/./cudnn/./cudnn_algoreg-inl.h:97: Running performance tests to find the best convolution algorithm, this can take a while... (set the environment variable MXNET_CUDNN_AUTOTUNE_DEFAULT to 0 to disable)
WARNING:root:Pooling: ONNX currently doesn't support pooling_convention. This might lead to shape or accuracy issues. https://github.com/onnx/onnx/issues/549
Traceback (most recent call last):
File "./model_zoo_export_onnx.py", line 25, in <module>
onnx_model_path = onnx_mxnet.export_model(syms, params, [input_shape], np.float32, onnx_file)
File "/home/zcz/anaconda3/lib/python3.7/site-packages/mxnet/contrib/onnx/mx2onnx/export_model.py", line 83, in export_model
verbose=verbose)
File "/home/zcz/anaconda3/lib/python3.7/site-packages/mxnet/contrib/onnx/mx2onnx/export_onnx.py", line 312, in create_onnx_graph_proto
checker.check_graph(graph)
File "/home/zcz/anaconda3/lib/python3.7/site-packages/onnx/checker.py", line 52, in checker
proto.SerializeToString(), ctx)
onnx.onnx_cpp2py_export.checker.ValidationError: Unrecognized attribute: spatial for operator BatchNormalization
==> Context: Bad node spec: input: "mobilenetv20_features_conv0_fwd" input: "mobilenetv20_features_batchnorm0_gamma" input: "mobilenetv20_features_batchnorm0_beta" input: "mobilenetv20_features_batchnorm0_running_mean" input: "mobilenetv20_features_batchnorm0_running_var" output: "mobilenetv20_features_batchnorm0_fwd" name: "mobilenetv20_features_batchnorm0_fwd" op_type: "BatchNormalization" attribute { name: "epsilon" f: 1e-05 type: FLOAT } attribute { name: "momentum" f: 0.9 type: FLOAT } attribute { name: "spatial" i: 0 type: INT }
## Minimum reproducible example
```
#!/usr/bin/env python3
# coding: utf-8
import numpy as np
import mxnet as mx
from mxnet.contrib import onnx as onnx_mxnet
from mxnet.gluon.model_zoo import vision
import onnx
from onnx import checker

if __name__ == '__main__':
    ctx = mx.cpu()
    prefix = 'model_zoo'
    syms = prefix + '-symbol.json'
    params = prefix + '-0000.params'
    onnx_file = prefix + '.onnx'
    input_shape = (1, 3, 224, 224)
    # Download the pretrained model and export the symbol/params files
    mobile_net = vision.mobilenet_v2_1_0(pretrained=True, ctx=ctx)
    mobile_net.hybridize()
    mobile_net(mx.nd.ones(input_shape, ctx=ctx))
    mobile_net.export(prefix)
    # Convert to ONNX -- this is where the ValidationError is raised
    onnx_model_path = onnx_mxnet.export_model(syms, params, [input_shape], np.float32, onnx_file)
    # Load the ONNX model
    model_proto = onnx.load_model(onnx_model_path)
    # Check whether the converted ONNX protobuf is valid
    checker.check_graph(model_proto.graph)
```
## What have you tried to solve it?
1. I updated onnx to version 1.6.0, but that does not fix the issue.
2. I exported the same model with PyTorch, and that works fine.