Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2018/09/16 07:53:53 UTC

[GitHub] xuzhenqi opened a new issue #12575: How to use tensorrt in FP16 or Int8 mode?

URL: https://github.com/apache/incubator-mxnet/issues/12575
 
 
   MXNet 1.3.0 supports TensorRT for inference, but I cannot find any tutorials or examples showing how to run inference on a model in FP16 or INT8 mode. Is there a way to do this?
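   For context, following the contrib TensorRT API that shipped with 1.3.0, my FP32 inference path looks roughly like the sketch below (the checkpoint prefix, epoch, and input shape are placeholders). I do not see any bind-time parameter here that selects FP16 or INT8:

       import os
       import mxnet as mx

       # Enable the experimental TensorRT graph pass in MXNet 1.3.0.
       os.environ['MXNET_USE_TENSORRT'] = '1'

       batch_shape = (1, 3, 224, 224)  # placeholder input shape

       # Load a checkpoint (prefix and epoch are placeholders).
       sym, arg_params, aux_params = mx.model.load_checkpoint('resnet-18', 0)

       # tensorrt_bind takes a single merged parameter dict on the GPU.
       all_params = {k: v.as_in_context(mx.gpu(0)) for k, v in arg_params.items()}
       all_params.update({k: v.as_in_context(mx.gpu(0)) for k, v in aux_params.items()})

       executor = mx.contrib.tensorrt.tensorrt_bind(sym, ctx=mx.gpu(0),
                                                    all_params=all_params,
                                                    data=batch_shape,
                                                    grad_req='null',
                                                    force_rebind=True)

       # Inference runs the TensorRT-optimized subgraphs, in FP32 as far as I can tell.
       data = mx.nd.zeros(batch_shape, ctx=mx.gpu(0))
       output = executor.forward(is_train=False, data=data)[0]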

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services