Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/04/28 00:55:38 UTC

[jira] [Resolved] (SPARK-7103) SparkContext.union crashed when some RDDs have no partitioner

     [ https://issues.apache.org/jira/browse/SPARK-7103?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-7103.
------------------------------
       Resolution: Fixed
    Fix Version/s: 1.4.0
                   1.3.2

Issue resolved by pull request 5679
[https://github.com/apache/spark/pull/5679]
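
The patch itself is not quoted in this thread. Purely as an illustration of the kind of guard that resolves the crash (an assumption about the change, not the merged diff), SparkContext.union can fall back to the plain UnionRDD whenever any input lacks a partitioner:

{noformat}
import scala.reflect.ClassTag

// Sketch only: pick the partitioner-aware union only when *every* input RDD
// defines a partitioner and they all agree; otherwise use the plain UnionRDD.
// (PartitionerAwareUnionRDD is private[spark], so this shape only compiles
// inside Spark itself; it is shown here just to illustrate the guard.)
def union[T: ClassTag](rdds: Seq[RDD[T]]): RDD[T] = {
  val partitioners = rdds.flatMap(_.partitioner).toSet
  if (rdds.forall(_.partitioner.isDefined) && partitioners.size == 1) {
    new PartitionerAwareUnionRDD(this, rdds)
  } else {
    new UnionRDD(this, rdds)
  }
}
{noformat}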

> SparkContext.union crashed when some RDDs have no partitioner
> -------------------------------------------------------------
>
>                 Key: SPARK-7103
>                 URL: https://issues.apache.org/jira/browse/SPARK-7103
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.3.0, 1.3.1
>            Reporter: Steven She
>            Priority: Critical
>             Fix For: 1.3.2, 1.4.0
>
>
> I encountered a bug where Spark crashes with the following stack trace:
> {noformat}
> java.util.NoSuchElementException: None.get
> 	at scala.None$.get(Option.scala:313)
> 	at scala.None$.get(Option.scala:311)
> 	at org.apache.spark.rdd.PartitionerAwareUnionRDD.getPartitions(PartitionerAwareUnionRDD.scala:69)
> {noformat}
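> The None.get in the trace comes from PartitionerAwareUnionRDD assuming that every input RDD defines a partitioner; an RDD produced by parallelize has none, so taking .get on that empty Option throws. A standalone illustration in the shell (not the Spark source itself):
> {noformat}
> val noPartitioner = sc.parallelize(Seq(1 -> true))
> noPartitioner.partitioner       // None: parallelize does not assign a partitioner
> noPartitioner.partitioner.get   // throws java.util.NoSuchElementException: None.get
> {noformat}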
> Here's a minimal example that reproduces it on the Spark shell:
> {noformat}
> import org.apache.spark.HashPartitioner
> val x = sc.parallelize(Seq(1 -> true, 2 -> true, 3 -> false)).partitionBy(new HashPartitioner(1))
> val y = sc.parallelize(Seq(1->true))
> sc.union(y, x).count() // crashes: the first RDD has no partitioner
> sc.union(x, y).count() // works: the first RDD has a partitioner
> {noformat}
> As a workaround, we had to instantiate UnionRDD directly to avoid the PartitionerAwareUnionRDD that sc.union selects.
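> A sketch of that workaround, reusing x and y from the example above (UnionRDD is public in org.apache.spark.rdd and its constructor takes the SparkContext and the input RDDs):
> {noformat}
> import org.apache.spark.rdd.UnionRDD
> // Build the plain UnionRDD directly instead of calling sc.union, which picks
> // PartitionerAwareUnionRDD because one of the inputs defines a partitioner.
> new UnionRDD(sc, Seq(y, x)).count()
> {noformat}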


