Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/08/01 13:34:20 UTC
[jira] [Commented] (SPARK-16831) CrossValidator reports incorrect avgMetrics
[ https://issues.apache.org/jira/browse/SPARK-16831?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15402031#comment-15402031 ]
Sean Owen commented on SPARK-16831:
-----------------------------------
The Scala version does scale this correctly. Looks like a bug for the Pyspark version, go ahead and fix it.
> CrossValidator reports incorrect avgMetrics
> -------------------------------------------
>
> Key: SPARK-16831
> URL: https://issues.apache.org/jira/browse/SPARK-16831
> Project: Spark
> Issue Type: Bug
> Components: ML, PySpark
> Affects Versions: 2.0.0
> Reporter: Max Moroz
>
> The avgMetrics are summed across all folds instead of being averaged, so the reported values grow with the number of folds. This is an easy fix in the CrossValidator._fit() function: {code}metrics[j] += metric{code} should be {code}metrics[j] += metric / nFolds{code}.
> {code}
> import pyspark.ml.tuning
> import pyspark.ml.regression
> import pyspark.ml.evaluation
> from pyspark.ml.linalg import Vectors
>
> dataset = spark.createDataFrame(
>     [(Vectors.dense([0.0]), 0.0),
>      (Vectors.dense([0.4]), 1.0),
>      (Vectors.dense([0.5]), 0.0),
>      (Vectors.dense([0.6]), 1.0),
>      (Vectors.dense([1.0]), 1.0)] * 1000,
>     ["features", "label"]).cache()
>
> paramGrid = pyspark.ml.tuning.ParamGridBuilder().build()
>
> # Baseline: TrainValidationSplit reports a sensible validation metric.
> tvs = pyspark.ml.tuning.TrainValidationSplit(
>     estimator=pyspark.ml.regression.LinearRegression(),
>     estimatorParamMaps=paramGrid,
>     evaluator=pyspark.ml.evaluation.RegressionEvaluator(),
>     trainRatio=0.8)
> model = tvs.fit(dataset)
> print(model.validationMetrics)
>
> # CrossValidator: avgMetrics scales with the number of folds.
> for folds in (3, 5, 10):
>     cv = pyspark.ml.tuning.CrossValidator(
>         estimator=pyspark.ml.regression.LinearRegression(),
>         estimatorParamMaps=paramGrid,
>         evaluator=pyspark.ml.evaluation.RegressionEvaluator(),
>         numFolds=folds)
>     cvModel = cv.fit(dataset)
>     print(folds, cvModel.avgMetrics)
> {code}
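To make the proposed one-line fix concrete, here is a minimal plain-Python sketch of the accumulation loop. The variable names (metrics, nFolds, fold_scores) only mirror the shape of CrossValidator._fit(); the fold scores are illustrative, not real evaluator output.

```python
# Sketch of the fix: divide each per-fold metric by nFolds while
# accumulating, so metrics[j] ends up as an average, not a sum.
nFolds = 3
numModels = 2  # one slot per param map in the grid

# fold_scores[i][j]: metric for param map j evaluated on fold i
fold_scores = [[0.9, 0.6], [0.8, 0.5], [0.7, 0.4]]

metrics = [0.0] * numModels
for i in range(nFolds):
    for j in range(numModels):
        metric = fold_scores[i][j]
        metrics[j] += metric / nFolds  # the fix: was `metrics[j] += metric`

# metrics is now the per-param-map average across folds
# (approximately [0.8, 0.5] here), independent of nFolds.
print(metrics)
```

Without the division, metrics would come out as [2.4, 1.5] here, and would keep growing as numFolds increases, which matches the reported behaviour.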
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org