Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/08/22 01:33:46 UTC
[jira] [Assigned] (SPARK-10164) GMM bug: match error
[ https://issues.apache.org/jira/browse/SPARK-10164?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-10164:
------------------------------------
Assignee: Apache Spark (was: Joseph K. Bradley)
> GMM bug: match error
> --------------------
>
> Key: SPARK-10164
> URL: https://issues.apache.org/jira/browse/SPARK-10164
> Project: Spark
> Issue Type: Bug
> Components: MLlib
> Affects Versions: 1.5.0
> Reporter: Joseph K. Bradley
> Assignee: Apache Spark
> Priority: Critical
>
> GaussianMixture now distributes matrix decompositions for sufficiently large problem sizes. The distributed code path fails at runtime, but the unit tests did not exercise it, so the failure went unnoticed. This is a regression.
> Here is an example failure:
> {code}
> Exception in thread "main" scala.MatchError: ArrayBuffer(0.05000000000000001, 0.05000000000000001, 0.05000000000000001, 0.05000000000000
> 001, 0.05000000000000001, 0.05000000000000001, 0.05000000000000001, 0.05000000000000001, 0.05000000000000001, 0.05000000000000001, 0.050
> 00000000000001, 0.05000000000000001, 0.05000000000000001, 0.05000000000000001, 0.05000000000000001, 0.05000000000000001, 0.0500000000000
> 0001, 0.05000000000000001, 0.05000000000000001, 0.05000000000000001) (of class scala.collection.mutable.ArrayBuffer)
> at scala.runtime.ScalaRunTime$.array_apply(ScalaRunTime.scala:71)
> at scala.Array$.slowcopy(Array.scala:81)
> at scala.Array$.copy(Array.scala:107)
> at org.apache.spark.mllib.clustering.GaussianMixture.run(GaussianMixture.scala:215)
> at mllib.perf.clustering.GaussianMixtureTest.run(GaussianMixtureTest.scala:60)
> at mllib.perf.TestRunner$$anonfun$2.apply(TestRunner.scala:66)
> at mllib.perf.TestRunner$$anonfun$2.apply(TestRunner.scala:64)
> at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
> at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
> at scala.collection.immutable.Range.foreach(Range.scala:141)
> at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
> at scala.collection.AbstractTraversable.map(Traversable.scala:105)
> at mllib.perf.TestRunner$.main(TestRunner.scala:64)
> at mllib.perf.TestRunner.main(TestRunner.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 15/08/21 21:25:33 INFO spark.SparkContext: Invoking stop() from shutdown hook
> {code}
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org