Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2019/10/21 17:11:16 UTC

[GitHub] [incubator-mxnet] marfago opened a new issue #16568: Different (not uniform) behavior in RMSE, MSE, MAE

marfago opened a new issue #16568: Different (not uniform) behavior in RMSE, MSE, MAE
URL: https://github.com/apache/incubator-mxnet/issues/16568
 
 
   Looking at the metrics, it seems they behave in slightly different ways depending on how they aggregate the batch inputs. For example, Accuracy does not compute the mean of each single batch; it accumulates the counts over all batches and computes one global mean in the get method:
   
   https://github.com/apache/incubator-mxnet/blob/master/python/mxnet/metric.py#L507
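   
   For reference, a minimal sketch of this Accuracy-style accumulation (plain NumPy, not the actual metric.py code; the class name is illustrative, while sum_metric / num_inst follow the convention used in metric.py): counts are accumulated across batches and a single global mean is taken in get.
   
   import numpy as np
   
   class GlobalMeanAccuracy:
       """Accumulate correct-prediction counts across batches; one global mean in get()."""
       def __init__(self):
           self.sum_metric = 0.0   # total number of correct predictions seen so far
           self.num_inst = 0       # total number of samples seen so far
   
       def update(self, labels, preds):
           # labels: (N,) integer class labels, preds: (N, num_classes) scores
           pred_labels = np.argmax(preds, axis=1)
           self.sum_metric += float((pred_labels == labels).sum())
           self.num_inst += labels.shape[0]
   
       def get(self):
           # single mean over all samples seen, not a mean of per-batch means
           return self.sum_metric / self.num_inst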
   
   RMSE, on the contrary, computes the RMSE of each batch separately and then averages those per-batch values in the get method.
   
   https://github.com/apache/incubator-mxnet/blob/master/python/mxnet/metric.py#L1273
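   
   And a corresponding sketch of the RMSE-style accumulation (again simplified, not the actual code): update adds one per-batch RMSE value and counts one instance per batch, so get returns the mean of the per-batch RMSEs.
   
   import numpy as np
   
   class PerBatchMeanRMSE:
       """Compute RMSE per batch; get() averages the per-batch values."""
       def __init__(self):
           self.sum_metric = 0.0   # sum of per-batch RMSE values
           self.num_inst = 0       # number of batches seen
   
       def update(self, labels, preds):
           batch_rmse = float(np.sqrt(np.mean((preds - labels) ** 2)))
           self.sum_metric += batch_rmse
           self.num_inst += 1      # one instance per batch, not per sample
   
       def get(self):
           # mean of per-batch RMSEs, not the RMSE over all samples
           return self.sum_metric / self.num_inst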
   
   This leads to a discrepancy when using metrics that are non-linear in this sense (e.g. MAE, MSE, RMSE, PCC), where metric(samples) != mean(metric(batch_samples)). For MAE and MSE the mismatch appears when batch sizes differ (each batch mean gets equal weight regardless of its size), while for RMSE and PCC it appears even with equal batch sizes, since the square root and the correlation do not commute with averaging.
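   
   A quick numerical illustration of the mismatch for RMSE, using two batches of two samples each:
   
   import numpy as np
   
   def rmse(labels, preds):
       return float(np.sqrt(np.mean((preds - labels) ** 2)))
   
   labels = np.array([0.0, 0.0, 0.0, 0.0])
   preds  = np.array([1.0, 1.0, 3.0, 3.0])
   
   full_rmse = rmse(labels, preds)                    # sqrt((1+1+9+9)/4) = sqrt(5) ~ 2.236
   per_batch = np.mean([rmse(labels[:2], preds[:2]),  # sqrt(1) = 1.0
                        rmse(labels[2:], preds[2:])]) # sqrt(9) = 3.0, so the mean is 2.0
   print(full_rmse, per_batch)                        # 2.236... vs 2.0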

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services