Posted to issues@spark.apache.org by "Ameen Tayyebi (JIRA)" <ji...@apache.org> on 2016/10/04 18:18:20 UTC
[jira] [Issue Comment Deleted] (SPARK-17777) Spark Scheduler Hangs Indefinitely
[ https://issues.apache.org/jira/browse/SPARK-17777?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Ameen Tayyebi updated SPARK-17777:
----------------------------------
Comment: was deleted
(was: Here's the repro code:

import org.apache.spark.{Partition, SparkContext, TaskContext}
import org.apache.spark.rdd.RDD

// RDD whose getPartitions submits a nested Spark job via parallelize/collect.
class testRDD(@transient sc: SparkContext)
  extends RDD[(String, Int)](sc, Nil)
  with Serializable {

  override def getPartitions: Array[Partition] = {
    // Nested job submitted from inside getPartitions -- the suspected trigger.
    sc.parallelize(Seq(("a", 1), ("b", 2))).reduceByKey(_ + _).collect()
    val result = new Array[Partition](4)
    for (i <- 0 until 4) {
      result(i) = new Partition {
        override def index: Int = i // each partition must report its own index, not 0
      }
    }
    result
  }

  override def compute(split: Partition, context: TaskContext): Iterator[(String, Int)] =
    Seq(("a", 3), ("b", 4)).iterator
}

val y = new testRDD(sc)
y.map(r => r).reduceByKey(_ + _).count()

This can be pasted directly into spark-shell.)
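One interim workaround -- a minimal sketch, not a confirmed fix, and the class name eagerTestRDD and the field precomputed are illustrative -- is to run the inner job eagerly when the RDD is constructed, so that getPartitions itself never submits a job:

import org.apache.spark.{Partition, SparkContext, TaskContext}
import org.apache.spark.rdd.RDD

// Hypothetical variant of the repro class: the nested job runs once in the
// constructor, on the driver, before any action ever evaluates the partitions.
class eagerTestRDD(@transient sc: SparkContext)
  extends RDD[(String, Int)](sc, Nil)
  with Serializable {

  // Materialized up front; getPartitions below only reads this local result.
  private val precomputed: Array[(String, Int)] =
    sc.parallelize(Seq(("a", 1), ("b", 2))).reduceByKey(_ + _).collect()

  override def getPartitions: Array[Partition] = {
    // Plain driver-side code only; no job submission from here.
    val result = new Array[Partition](precomputed.length)
    for (i <- precomputed.indices) {
      result(i) = new Partition {
        override def index: Int = i
      }
    }
    result
  }

  override def compute(split: Partition, context: TaskContext): Iterator[(String, Int)] =
    Seq(("a", 3), ("b", 4)).iterator
}

If the nested submission is indeed the trigger, the original action sequence should then run unchanged:

val z = new eagerTestRDD(sc)
z.map(r => r).reduceByKey(_ + _).count()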
> Spark Scheduler Hangs Indefinitely
> ----------------------------------
>
> Key: SPARK-17777
> URL: https://issues.apache.org/jira/browse/SPARK-17777
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.6.0
> Environment: AWS EMR 4.3, can also be reproduced locally
> Reporter: Ameen Tayyebi
>
> We've identified a problem with Spark scheduling. The issue manifests itself when an RDD calls SparkContext.parallelize within its getPartitions method. This seemingly "recursive" call causes the problem. We have a repro case that can easily be run.
> Please advise on what the issue might be and how we can work around it in the meantime.
> Thanks,
> -Ameen
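A second sketch that needs no class changes -- hedged, since it assumes the hang depends on where getPartitions is first evaluated -- is to force partition evaluation from the driver thread before calling any action. RDD caches the result of getPartitions, so later scheduling reuses it:

val y = new testRDD(sc)
// Evaluate (and cache) the partitions on the driver thread first; the nested
// collect() then runs as an ordinary job rather than during job scheduling.
y.partitions
y.map(r => r).reduceByKey(_ + _).count()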
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)