Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/02/05 08:38:27 UTC

[GitHub] lupesko commented on a change in pull request #14041: modifying SyncBN doc for FP16 use case

lupesko commented on a change in pull request #14041: modifying SyncBN doc for FP16 use case
URL: https://github.com/apache/incubator-mxnet/pull/14041#discussion_r253771427
 
 

 ##########
 File path: python/mxnet/gluon/contrib/nn/basic_layers.py
 ##########
 @@ -165,7 +165,10 @@ class SyncBatchNorm(BatchNorm):
 
     Standard BN [1]_ implementation only normalize the data within each device.
     SyncBN normalizes the input within the whole mini-batch.
-    We follow the sync-onece implmentation described in the paper [2]_.
+    We follow the implementation described in the paper [2]_.
+
+    Note: Current implementation of SyncBN does not support FP16 training.
 
 Review comment:
   So does it not support training or inference?
   You say training in this line, and refer to inference in the next line.
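
  A minimal, hypothetical sketch (plain NumPy, not MXNet code) of the distinction the docstring is describing: standard BatchNorm computes mean/variance per device, while SyncBatchNorm exchanges statistics so every device normalizes with the global mini-batch mean/variance. The device split and values here are invented for illustration.

```python
import numpy as np

# Two hypothetical "devices", each holding half of a global mini-batch.
dev0 = np.array([1.0, 2.0])    # samples on device 0
dev1 = np.array([10.0, 20.0])  # samples on device 1

eps = 1e-5

# Standard BN: each device normalizes with its OWN mean/variance.
norm0 = (dev0 - dev0.mean()) / np.sqrt(dev0.var() + eps)
norm1 = (dev1 - dev1.mean()) / np.sqrt(dev1.var() + eps)

# SyncBN: devices synchronize statistics, so all samples are normalized
# with the GLOBAL mini-batch mean/variance.
global_batch = np.concatenate([dev0, dev1])
global_mean = global_batch.mean()   # (1 + 2 + 10 + 20) / 4 = 8.25
norm_sync = (global_batch - global_mean) / np.sqrt(global_batch.var() + eps)
```

  With a small per-device batch, the per-device means (1.5 and 15.0) differ sharply from the global mean (8.25), which is why synchronizing statistics matters when the per-device batch size is small.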

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services