Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/02/05 20:09:25 UTC

[GitHub] mseth10 commented on a change in pull request #14041: modifying SyncBN doc for FP16 use case

URL: https://github.com/apache/incubator-mxnet/pull/14041#discussion_r254027445
 
 

 ##########
 File path: python/mxnet/gluon/contrib/nn/basic_layers.py
 ##########
 @@ -165,7 +165,10 @@ class SyncBatchNorm(BatchNorm):
 
     Standard BN [1]_ implementation normalizes the data only within each device.
     SyncBN normalizes the input within the whole mini-batch.
-    We follow the sync-onece implmentation described in the paper [2]_.
+    We follow the implementation described in the paper [2]_.
+
+    Note: Current implementation of SyncBN does not support FP16 training.
 
 Review comment:
  SyncBN lacks FP16 support for both training and inference. For FP16
  inference, however, SyncBN can be replaced with nn.BatchNorm: at
  inference time the two layers are functionally equivalent, since both
  apply the same affine transform from the stored running statistics.
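
  For reference, the swap the comment describes could look like the
  minimal sketch below. This is illustrative only, not code from the PR
  or the MXNet docs; the toy layer sizes, the GPU context, and the
  positional parameter-copy loop are assumptions.

      import mxnet as mx
      from mxnet import nd
      from mxnet.gluon import nn
      from mxnet.gluon.contrib.nn import SyncBatchNorm

      ctx = mx.gpu(0)  # FP16 inference generally assumes a GPU context

      # FP32 network as it would be trained, using SyncBatchNorm.
      # in_channels is given explicitly so parameters materialize at
      # initialize() and can be copied without a warm-up forward pass.
      net_train = nn.HybridSequential()
      net_train.add(nn.Conv2D(16, kernel_size=3, in_channels=3),
                    SyncBatchNorm(in_channels=16),
                    nn.Activation('relu'))
      net_train.initialize(ctx=ctx)

      # Inference network: identical architecture with nn.BatchNorm
      # swapped in for SyncBatchNorm.
      net_infer = nn.HybridSequential()
      net_infer.add(nn.Conv2D(16, kernel_size=3, in_channels=3),
                    nn.BatchNorm(in_channels=16),
                    nn.Activation('relu'))
      net_infer.initialize(ctx=ctx)

      # Copy the trained parameters across. Both normalization layers
      # hold the same parameter set (gamma, beta, running_mean,
      # running_var), so a positional copy between identical networks
      # lines up.
      for src, dst in zip(net_train.collect_params().values(),
                          net_infer.collect_params().values()):
          dst.set_data(src.data())

      # Cast to FP16 and run inference. Gluon's BatchNorm keeps its own
      # parameters in FP32 under cast('float16'), as the BN operator
      # requires.
      net_infer.cast('float16')
      x = nd.random.uniform(shape=(1, 3, 32, 32), ctx=ctx).astype('float16')
      out = net_infer(x)

  The swap is sound for inference only: the two layers differ solely in
  how batch statistics are aggregated across devices during training.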

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services