Posted to issues@spark.apache.org by "Nicholas Brett Marcott (Jira)" <ji...@apache.org> on 2020/12/21 00:54:00 UTC

[jira] [Resolved] (SPARK-32904) pyspark.mllib.evaluation.MulticlassMetrics needs to swap the results of precision( ) and recall( )

     [ https://issues.apache.org/jira/browse/SPARK-32904?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nicholas Brett Marcott resolved SPARK-32904.
--------------------------------------------
    Resolution: Not A Bug

Spark's calculations look correct.
The columns in the confusion matrix are predictions, so the total number of predictions for label = 1 is 4455 (1610 + 2845). Of those, 1610 are true positives, so precision is TP / (TP + FP) = 1610 / 4455 ~= 0.36. Likewise, the rows are actual labels, so recall is TP / (TP + FN) = 1610 / (1610 + 3839) = 1610 / 5449 ~= 0.295. Note that 5449 is the row total for label = 1 (the number of actual positives), which is why recall, not precision, comes out to 1610 / 5449.

Please reopen with a specific test case that fails if you see any further issues.
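For future readers, here is a minimal, self-contained sketch (toy data, not the reporter's dataset) illustrating the convention: precision(label) divides the true positives by the column total (everything predicted as that label), while recall(label) divides by the row total (everything actually of that label).

from pyspark import SparkContext
from pyspark.mllib.evaluation import MulticlassMetrics

sc = SparkContext.getOrCreate()

# (prediction, label) pairs: for label 1, TP = 2, FP = 2, FN = 1, TN = 1
pairs = sc.parallelize([
    (1.0, 1.0), (1.0, 1.0),  # true positives
    (1.0, 0.0), (1.0, 0.0),  # false positives
    (0.0, 1.0),              # false negative
    (0.0, 0.0),              # true negative
])

metrics = MulticlassMetrics(pairs)
print(metrics.confusionMatrix().toArray())   # rows = actual labels, columns = predictions
print("precision:", metrics.precision(1.0))  # TP / (TP + FP) = 2 / 4 = 0.5
print("recall:", metrics.recall(1.0))        # TP / (TP + FN) = 2 / 3 ~= 0.667

Applying the same convention to the matrix in the report below gives precision(1) = 1610 / 4455 and recall(1) = 1610 / 5449, exactly the values the reporter saw.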

> pyspark.mllib.evaluation.MulticlassMetrics needs to swap the results of precision( ) and recall( )
> --------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-32904
>                 URL: https://issues.apache.org/jira/browse/SPARK-32904
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib
>    Affects Versions: 3.0.1
>            Reporter: TinaLi
>            Priority: Major
>
> [https://spark.apache.org/docs/latest/api/java/org/apache/spark/mllib/evaluation/MulticlassMetrics.html]
> *The values returned by the precision() and recall() methods of this API should be swapped.*
> Below are the example results I got when I ran this API:
> # predictionAndLabels is an RDD of (prediction, label) pairs
> from pyspark.mllib.evaluation import MulticlassMetrics
> metrics = MulticlassMetrics(predictionAndLabels)
> print(metrics.confusionMatrix().toArray())
> print("precision:", metrics.precision(1))
> print("recall:", metrics.recall(1))
> [[36631.  2845.]
>  [ 3839.  1610.]]
> precision: 0.3613916947250281
> recall: 0.2954670581758121
>  
> predictions.select('prediction').agg({'prediction':'sum'}).show()
> +---------------+
> |sum(prediction)|
> +---------------+
> |         5449.0|
> +---------------+
> As you can see, my model predicted 5449 cases with label = 1, and 1610 of those 5449 cases are true positives, so precision should be 1610/5449 = 0.2954670581758121; but this API assigns that value to the recall() method instead, so the two results should be swapped.


