Posted to issues@spark.apache.org by "dzcxzl (JIRA)" <ji...@apache.org> on 2018/07/10 12:35:00 UTC
[jira] [Updated] (SPARK-24677) Avoid NoSuchElementException from MedianHeap
[ https://issues.apache.org/jira/browse/SPARK-24677?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
dzcxzl updated SPARK-24677:
---------------------------
Summary: Avoid NoSuchElementException from MedianHeap (was: MedianHeap is empty when speculation is enabled, causing the SparkContext to stop)
> Avoid NoSuchElementException from MedianHeap
> --------------------------------------------
>
> Key: SPARK-24677
> URL: https://issues.apache.org/jira/browse/SPARK-24677
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 2.3.1
> Reporter: dzcxzl
> Priority: Critical
>
> After the changes in SPARK-23433, the speculation thread can call median on an empty MedianHeap, which stops the SparkContext:
> {code:java}
> ERROR Utils: uncaught error in thread task-scheduler-speculation, stopping SparkContext
> java.util.NoSuchElementException: MedianHeap is empty.
> at org.apache.spark.util.collection.MedianHeap.median(MedianHeap.scala:83)
> at org.apache.spark.scheduler.TaskSetManager.checkSpeculatableTasks(TaskSetManager.scala:968)
> at org.apache.spark.scheduler.Pool$$anonfun$checkSpeculatableTasks$1.apply(Pool.scala:94)
> at org.apache.spark.scheduler.Pool$$anonfun$checkSpeculatableTasks$1.apply(Pool.scala:93)
> at scala.collection.Iterator$class.foreach(Iterator.scala:742)
> at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
> at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
> at org.apache.spark.scheduler.Pool.checkSpeculatableTasks(Pool.scala:93)
> at org.apache.spark.scheduler.Pool$$anonfun$checkSpeculatableTasks$1.apply(Pool.scala:94)
> at org.apache.spark.scheduler.Pool$$anonfun$checkSpeculatableTasks$1.apply(Pool.scala:93)
> {code}
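The failure mode above can be reproduced with a minimal two-heap median tracker. This is a simplified sketch, not Spark's actual `MedianHeap`; the class `SimpleMedianHeap` and its methods are illustrative only. The point is that `median` on an empty heap throws exactly this `NoSuchElementException`, so a caller running on a background thread (like the speculation check) must guard with an emptiness test first.

```scala
import scala.collection.mutable

// Simplified sketch of a two-heap median tracker (not Spark's implementation).
// smaller: max-heap of the lower half; larger: min-heap of the upper half.
class SimpleMedianHeap {
  private val smaller = mutable.PriorityQueue.empty[Double]
  private val larger  = mutable.PriorityQueue.empty[Double](Ordering[Double].reverse)

  def isEmpty: Boolean = smaller.isEmpty && larger.isEmpty
  def size: Int = smaller.size + larger.size

  def insert(x: Double): Unit = {
    if (isEmpty || x < smaller.head) smaller.enqueue(x) else larger.enqueue(x)
    rebalance()
  }

  // Keep the two halves within one element of each other.
  private def rebalance(): Unit = {
    if (smaller.size - larger.size > 1) larger.enqueue(smaller.dequeue())
    else if (larger.size - smaller.size > 1) smaller.enqueue(larger.dequeue())
  }

  def median: Double = {
    // This is the throw site seen in the stack trace above.
    if (isEmpty) throw new NoSuchElementException("MedianHeap is empty.")
    if (size % 2 == 1) {
      if (smaller.size > larger.size) smaller.head else larger.head
    } else {
      (smaller.head + larger.head) / 2.0
    }
  }
}

object Demo extends App {
  val durations = new SimpleMedianHeap
  // Guarding the call avoids the uncaught exception when no task
  // durations have been recorded yet:
  val medianOpt = if (durations.isEmpty) None else Some(durations.median)
  println(medianOpt)        // None: nothing recorded, no crash
  Seq(2.0, 8.0, 4.0).foreach(durations.insert)
  println(durations.median) // 4.0
}
```

Calling `durations.median` directly before any insert would instead throw the `NoSuchElementException` shown in the log, which on the `task-scheduler-speculation` thread is uncaught and fatal to the SparkContext.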
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org