Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2014/10/06 07:35:33 UTC
[jira] [Commented] (SPARK-3568) Add metrics for ranking algorithms
[ https://issues.apache.org/jira/browse/SPARK-3568?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14160022#comment-14160022 ]
Apache Spark commented on SPARK-3568:
-------------------------------------
User 'coderxiang' has created a pull request for this issue:
https://github.com/apache/spark/pull/2667
> Add metrics for ranking algorithms
> ----------------------------------
>
> Key: SPARK-3568
> URL: https://issues.apache.org/jira/browse/SPARK-3568
> Project: Spark
> Issue Type: New Feature
> Components: MLlib
> Reporter: Shuo Xiang
> Assignee: Shuo Xiang
>
> Add common evaluation metrics for ranking algorithms (see http://www-nlp.stanford.edu/IR-book/), including:
> - Mean Average Precision
> - Precision@n: top-n precision
> - Discounted cumulative gain (DCG) and NDCG
> This implementation adds a new class called *RankingMetrics* under *org.apache.spark.mllib.evaluation*, which accepts input (prediction and label pairs) as *RDD[(Array[Double], Array[Double])]*. The following methods will be implemented:
> {code:title=RankingMetrics.scala|borderStyle=solid}
> class RankingMetrics(predictionAndLabels: RDD[(Array[Double], Array[Double])]) {
>   /* Returns the precision@k for each query */
>   lazy val precAtK: RDD[Array[Double]]
>   /* Returns the average precision for each query */
>   lazy val avePrec: RDD[Double]
>   /* Returns the mean average precision (MAP) of all the queries */
>   lazy val meanAvePrec: Double
>   /* Returns the normalized discounted cumulative gain for each query */
>   lazy val ndcg: RDD[Double]
>   /* Returns the mean NDCG of all the queries */
>   lazy val meanNdcg: Double
> }
> {code}
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org