Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2020/07/13 20:13:35 UTC

[GitHub] [incubator-mxnet] szha commented on issue #18697: AMP for mx2

szha commented on issue #18697:
URL: https://github.com/apache/incubator-mxnet/issues/18697#issuecomment-657767388


   @ptrendx the reasoning is that we shouldn't need to optimize pure imperative programs, because imperative mode is designed for model debugging and development, not for performance.
   
   That said, I recognize that there are still cases where we can't fully hybridize yet and still need performance. One short-term approach could be to utilize the [`mx.set_np_default_dtype`](https://github.com/apache/incubator-mxnet/blob/master/python/mxnet/util.py#L1108-L1146) interface. The problem it solves is switching the default output data type between fp32 and fp64, as the former is commonly used in deep learning and the latter is the numpy default. It should be straightforward to extend it to more data types.
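   To make the idea concrete, here is a minimal sketch in plain numpy of how a global default-dtype switch could generalize beyond fp32/fp64 (e.g. to fp16 for AMP-style use). The names `set_default_dtype`, `ones`, and `_default_dtype` are hypothetical stand-ins for illustration, not the actual MXNet API:

   ```python
   import numpy as np

   # Module-level default output dtype; fp32 matches the deep-learning default.
   _default_dtype = np.float32

   def set_default_dtype(dtype):
       """Switch the default output dtype; returns the previous default."""
       global _default_dtype
       allowed = (np.float16, np.float32, np.float64)
       if dtype not in allowed:
           raise ValueError("unsupported default dtype: %r" % (dtype,))
       prev = _default_dtype
       _default_dtype = dtype
       return prev

   def ones(shape, dtype=None):
       """Array creation that falls back to the configured default dtype."""
       return np.ones(shape, dtype=dtype if dtype is not None else _default_dtype)

   a = ones((2, 2))               # fp32 under the initial default
   set_default_dtype(np.float16)  # extend the fp32/fp64 switch to fp16
   b = ones((2, 2))               # now fp16
   ```

   The key design point is that array-creation entry points consult one process-wide setting, so adding a dtype only requires widening the allowed set rather than touching every call site.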


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org