Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2023/01/11 02:35:22 UTC

[GitHub] [tvm] guojilei opened a new issue, #13757: When the pad value of onnx is -1, tvm inference fails and onnxruntime inferences normally

guojilei opened a new issue, #13757:
URL: https://github.com/apache/tvm/issues/13757

   I have an ONNX model in which the convolution's pad value is -1. The model runs fine with onnxruntime, but TVM inference fails with an error.
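   For context, here is why a pad value of -1 is unusual: in the standard convolution output-size formula, a negative pad effectively crops the input border instead of extending it. The sketch below is illustrative arithmetic only, not TVM's or onnxruntime's actual implementation, and the function name is hypothetical.

   ```python
   def conv_out_size(in_size: int, kernel: int, stride: int,
                     pad_begin: int, pad_end: int) -> int:
       """Spatial output size of a convolution, per the standard formula.

       A negative pad (e.g. -1, as in the reported model) shrinks the
       output by trimming the input border. Some runtimes tolerate this;
       others reject negative pads outright.
       """
       return (in_size + pad_begin + pad_end - kernel) // stride + 1

   # With a 5-wide input and a 3-wide kernel, stride 1:
   print(conv_out_size(5, 3, 1, 1, 1))    # pads=1  -> 5 ("same"-style size)
   print(conv_out_size(5, 3, 1, 0, 0))    # pads=0  -> 3 ("valid")
   print(conv_out_size(5, 3, 1, -1, -1))  # pads=-1 -> 1 (border cropped)
   ```

   A frontend that assumes pads are non-negative can therefore compute an inconsistent shape (or reject the attribute) where a runtime that treats negative pads as cropping still produces a valid, smaller output.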
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@tvm.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [tvm] chengven027-intellif commented on issue #13757: When the pad value of onnx is -1, tvm inference fails and onnxruntime inferences normally

Posted by GitBox <gi...@apache.org>.
chengven027-intellif commented on issue #13757:
URL: https://github.com/apache/tvm/issues/13757#issuecomment-1378195060

   Hi @guojilei, can you provide your ONNX model?

