Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/02/19 00:33:53 UTC

[GitHub] eric-haibin-lin opened a new issue #14198: Dropout inference performance
URL: https://github.com/apache/incubator-mxnet/issues/14198
 
 
   Dropout is a commonly used component for NN training. However, it is currently not well optimized for inference: in inference mode, the dropout op still produces two outputs (out, mask) and copies the input to one of them (out). Ideally, for inference we should
   1. check `kWriteInplace` and avoid the copy
   2. produce a single output instead of two
   
   For (1), the fix should be straightforward: just check `req` in the op implementation (like the nd.identity op does: https://github.com/apache/incubator-mxnet/blob/30655f9ef58daa778d6ea049940068f0ff9a21bf/src/operator/tensor/elemwise_unary_op.h#L305). For (2), however, we need runtime information (is_train=True/False), which is not available at the time the number of outputs of the op is inferred. @junrushao1994 we should also consider this information for backend op registration.
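   To illustrate point (1), here is a minimal NumPy sketch (not MXNet's actual implementation; the `req` values and the `dropout_forward` signature are hypothetical stand-ins for the operator's request-type mechanism). At inference time, a `kWriteInplace` request means the output buffer aliases the input, so the copy can be skipped entirely, and no mask needs to be produced:

   ```python
   import numpy as np

   # Hypothetical stand-ins for MXNet's OpReqType values.
   KWRITETO = "write_to"
   KWRITEINPLACE = "write_inplace"

   def dropout_forward(data, p=0.5, is_train=False, req=KWRITETO, out=None):
       """Sketch of a dropout forward pass.

       Training: returns (out, mask). Inference: returns a single output,
       skipping the copy when the request is write-inplace.
       """
       if not is_train:
           if req == KWRITEINPLACE:
               # Point (1): output aliases the input -- no copy needed.
               return data
           # Point (2): a single output, no mask allocated.
           if out is None:
               out = data.copy()
           else:
               np.copyto(out, data)
           return out
       # Training path: sample a mask and rescale (inverted dropout).
       mask = (np.random.rand(*data.shape) >= p).astype(data.dtype) / (1.0 - p)
       return data * mask, mask

   x = np.ones((2, 3), dtype=np.float32)
   y = dropout_forward(x, is_train=False, req=KWRITEINPLACE)
   # y is x: the inference path with write-inplace performed no copy.
   ```

   In the real operator this corresponds to branching on `req[dropout::kOut]` in the forward implementation; the harder part, as noted above, is that the output count must be decided before `is_train` is known.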
   
   cc @TaoLv @pengzhao-intel 
