Posted to issues@spark.apache.org by "Carson Wang (JIRA)" <ji...@apache.org> on 2015/08/11 03:47:46 UTC

[jira] [Created] (SPARK-9809) Task crashes because the internal accumulators are not properly initialized

Carson Wang created SPARK-9809:
----------------------------------

             Summary: Task crashes because the internal accumulators are not properly initialized
                 Key: SPARK-9809
                 URL: https://issues.apache.org/jira/browse/SPARK-9809
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.5.0
            Reporter: Carson Wang
            Priority: Blocker


Job aborted due to stage failure: Task 4 in stage 12.0 failed 4 times, most recent failure: Lost task 4.3 in stage 12.0 (TID 4460, 10.1.2.40): java.util.NoSuchElementException: key not found: peakExecutionMemory
        at scala.collection.MapLike$class.default(MapLike.scala:228)
        at scala.collection.AbstractMap.default(Map.scala:58)
        at scala.collection.MapLike$class.apply(MapLike.scala:141)
        at scala.collection.AbstractMap.apply(Map.scala:58)
        at org.apache.spark.util.collection.ExternalSorter.writePartitionedFile(ExternalSorter.scala:699)
        at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:80)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
        at org.apache.spark.scheduler.Task.run(Task.scala:88)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
        at java.lang.Thread.run(Thread.java:722)
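
For illustration only (not the actual Spark source): a minimal, standalone Scala sketch of the failure mode. When a task's map of internal accumulators was never populated, looking the "peakExecutionMemory" key up with Map.apply raises exactly the java.util.NoSuchElementException shown in the stack trace above. The object name, the empty map, and the guarded getOrElse lookup are illustrative assumptions, not Spark internals.

import java.util.NoSuchElementException

// Minimal sketch (assumed names, not Spark code): an empty map stands in for
// "internal accumulators are not properly initialized"; the key name comes
// from the stack trace above.
object AccumulatorLookupSketch {
  def main(args: Array[String]): Unit = {
    // Empty map models a task whose internal accumulators were never set up.
    val internalAccumulators: Map[String, Long] = Map.empty

    try {
      // Map.apply on a missing key throws
      // java.util.NoSuchElementException: key not found: peakExecutionMemory,
      // the same error reported from ExternalSorter.writePartitionedFile.
      val peak = internalAccumulators("peakExecutionMemory")
      println(s"peak execution memory: $peak")
    } catch {
      case e: NoSuchElementException =>
        println(s"lookup failed as in the report: ${e.getMessage}")
    }

    // A defensive getOrElse avoids the crash but silently loses the metric;
    // the report implies the real fix is to initialize the accumulators.
    val peakOrZero = internalAccumulators.getOrElse("peakExecutionMemory", 0L)
    println(s"guarded lookup: $peakOrZero")
  }
}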


