Posted to issues@spark.apache.org by "Joseph K. Bradley (JIRA)" <ji...@apache.org> on 2015/07/01 20:23:06 UTC
[jira] [Created] (SPARK-8765) Flaky PySpark PowerIterationClustering test
Joseph K. Bradley created SPARK-8765:
----------------------------------------
Summary: Flaky PySpark PowerIterationClustering test
Key: SPARK-8765
URL: https://issues.apache.org/jira/browse/SPARK-8765
Project: Spark
Issue Type: Test
Components: MLlib, PySpark
Reporter: Joseph K. Bradley
Priority: Critical
See failure: [https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/36133/console]
{code}
**********************************************************************
File "/home/jenkins/workspace/SparkPullRequestBuilder/python/pyspark/mllib/clustering.py", line 291, in __main__.PowerIterationClusteringModel
Failed example:
sorted(model.assignments().collect())
Expected:
[Assignment(id=0, cluster=1), Assignment(id=1, cluster=0), ...
Got:
[Assignment(id=0, cluster=1), Assignment(id=1, cluster=1), Assignment(id=2, cluster=1), Assignment(id=3, cluster=1), Assignment(id=4, cluster=0)]
**********************************************************************
File "/home/jenkins/workspace/SparkPullRequestBuilder/python/pyspark/mllib/clustering.py", line 299, in __main__.PowerIterationClusteringModel
Failed example:
sorted(sameModel.assignments().collect())
Expected:
[Assignment(id=0, cluster=1), Assignment(id=1, cluster=0), ...
Got:
[Assignment(id=0, cluster=1), Assignment(id=1, cluster=1), Assignment(id=2, cluster=1), Assignment(id=3, cluster=1), Assignment(id=4, cluster=0)]
**********************************************************************
2 of 13 in __main__.PowerIterationClusteringModel
***Test Failed*** 2 failures.
Had test failures in pyspark.mllib.clustering with python2.6; see logs.
{code}
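The two failures above show the doctest comparing against a hard-coded assignment list, while PowerIterationClustering can legitimately return the same partition under permuted cluster labels. A minimal sketch of a label-permutation-invariant comparison follows; the `Assignment` namedtuple mirrors the tuples in the log, and the helper name `same_partition` is hypothetical, not part of the Spark API.

```python
from collections import namedtuple

# Mirrors the Assignment(id=..., cluster=...) tuples printed in the failing doctest.
Assignment = namedtuple("Assignment", ["id", "cluster"])


def same_partition(a, b):
    """Return True if two assignment lists induce the same grouping of ids,
    ignoring the (arbitrary) numeric cluster labels."""
    def groups(assignments):
        by_cluster = {}
        for x in assignments:
            by_cluster.setdefault(x.cluster, set()).add(x.id)
        # A partition is the set of id-groups, independent of label values.
        return set(frozenset(s) for s in by_cluster.values())
    return groups(a) == groups(b)


# Same grouping {0} / {1,2,3,4}, but with cluster labels swapped.
expected = [Assignment(0, 1), Assignment(1, 0), Assignment(2, 0),
            Assignment(3, 0), Assignment(4, 0)]
relabeled = [Assignment(0, 0), Assignment(1, 1), Assignment(2, 1),
             Assignment(3, 1), Assignment(4, 1)]
print(same_partition(expected, relabeled))  # True
```

A check of this shape would pass regardless of which numeric label the algorithm happens to assign to each cluster, though it would still fail (as intended) if the grouping of ids itself changed between runs.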
CC: [~mengxr] [~yanboliang]
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)