Posted to issues@spark.apache.org by "KaiXu (JIRA)" <ji...@apache.org> on 2016/11/15 08:47:58 UTC

[jira] [Updated] (SPARK-18443) Spark leaks memory and leads to OOM

     [ https://issues.apache.org/jira/browse/SPARK-18443?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

KaiXu updated SPARK-18443:
--------------------------
    Environment: 
CentOS 7.2, kernel: 3.10.0-327.el7.x86_64
Hadoop 2.7.1
Spark 1.6.2 release version
Intel(R) Xeon(R) CPU E5-2699 v4 @ 2.20GHz
384GB memory

> Spark leaks memory and leads to OOM
> ---------------------------------
>
>                 Key: SPARK-18443
>                 URL: https://issues.apache.org/jira/browse/SPARK-18443
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.6.2
>         Environment: CentOS 7.2, kernel: 3.10.0-327.el7.x86_64
> Hadoop 2.7.1
> Spark 1.6.2 release version
> Intel(R) Xeon(R) CPU E5-2699 v4 @ 2.20GHz
> 384GB memory
>            Reporter: KaiXu
>
> The task failed with the log below:
> java.lang.OutOfMemoryError: Unable to acquire 35 bytes of memory, got 0
> 	at org.apache.spark.memory.MemoryConsumer.allocatePage(MemoryConsumer.java:120)
> 	at org.apache.spark.shuffle.sort.ShuffleExternalSorter.acquireNewPageIfNecessary(ShuffleExternalSorter.java:359)
> 	at org.apache.spark.shuffle.sort.ShuffleExternalSorter.insertRecord(ShuffleExternalSorter.java:380)
> 	at org.apache.spark.shuffle.sort.UnsafeShuffleWriter.insertRecordIntoSorter(UnsafeShuffleWriter.java:237)
> 	at org.apache.spark.shuffle.sort.UnsafeShuffleWriter.write(UnsafeShuffleWriter.java:164)
> 	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
> 	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
> 	at org.apache.spark.scheduler.Task.run(Task.scala:89)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:745)
> Executor log:
> 16/11/15 16:55:28 INFO spark.SparkRecordHandler: processing 100 rows: used memory = 10281746248
> 16/11/15 16:55:28 INFO spark.SparkRecordHandler: processing 1000 rows: used memory = 10281746248
> 16/11/15 16:55:28 INFO spark.SparkRecordHandler: processing 10000 rows: used memory = 10287819016
> 16/11/15 16:55:29 INFO exec.ReduceSinkOperator: keys are [reducesinkkey0] num distributions: 1
> 16/11/15 16:55:29 INFO exec.ReduceSinkOperator: RS[11]: records written - 1
> 16/11/15 16:55:29 INFO exec.ReduceSinkOperator: RS[11]: records written - 10
> 16/11/15 16:55:29 INFO exec.ReduceSinkOperator: RS[11]: records written - 100
> 16/11/15 16:55:29 INFO memory.TaskMemoryManager: Memory used in task 1250
> 16/11/15 16:55:29 INFO memory.TaskMemoryManager: Acquired by org.apache.spark.shuffle.sort.ShuffleExternalSorter@10651e73: 32.0 KB
> 16/11/15 16:55:29 INFO memory.TaskMemoryManager: 1755718878 bytes of memory were used by task 1250 but are not associated with specific consumers
> 16/11/15 16:55:29 INFO memory.TaskMemoryManager: 9970419198 bytes of memory are used for execution and 1802593 bytes of memory are used for storage
> 16/11/15 16:55:29 ERROR executor.Executor: Managed memory leak detected; size = 1755718878 bytes, TID = 1250
> 16/11/15 16:55:29 ERROR executor.Executor: Exception in task 372.0 in stage 1.0 (TID 1250)
> java.lang.OutOfMemoryError: Unable to acquire 35 bytes of memory, got 0
> 	at org.apache.spark.memory.MemoryConsumer.allocatePage(MemoryConsumer.java:120)
> 	at org.apache.spark.shuffle.sort.ShuffleExternalSorter.acquireNewPageIfNecessary(ShuffleExternalSorter.java:359)
> 	at org.apache.spark.shuffle.sort.ShuffleExternalSorter.insertRecord(ShuffleExternalSorter.java:380)
> 	at org.apache.spark.shuffle.sort.UnsafeShuffleWriter.insertRecordIntoSorter(UnsafeShuffleWriter.java:237)
> 	at org.apache.spark.shuffle.sort.UnsafeShuffleWriter.write(UnsafeShuffleWriter.java:164)
> 	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
> 	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
> 	at org.apache.spark.scheduler.Task.run(Task.scala:89)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:745)
> 16/11/15 16:55:29 ERROR util.SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-5,5,main]
> java.lang.OutOfMemoryError: Unable to acquire 35 bytes of memory, got 0
> 	at org.apache.spark.memory.MemoryConsumer.allocatePage(MemoryConsumer.java:120)
> 	at org.apache.spark.shuffle.sort.ShuffleExternalSorter.acquireNewPageIfNecessary(ShuffleExternalSorter.java:359)
> 	at org.apache.spark.shuffle.sort.ShuffleExternalSorter.insertRecord(ShuffleExternalSorter.java:380)
> 	at org.apache.spark.shuffle.sort.UnsafeShuffleWriter.insertRecordIntoSorter(UnsafeShuffleWriter.java:237)
> 	at org.apache.spark.shuffle.sort.UnsafeShuffleWriter.write(UnsafeShuffleWriter.java:164)
> 	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
> 	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
> 	at org.apache.spark.scheduler.Task.run(Task.scala:89)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:745)
> 16/11/15 16:55:29 INFO storage.DiskBlockManager: Shutdown hook called
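
As a reading aid for the executor log above, here is a minimal Scala sketch that only restates the byte counts reported by TaskMemoryManager and converts them to GiB; the object and helper names (MemoryAccountingSketch, gib) are invented for this illustration and are not part of Spark. The point it makes: the ShuffleExternalSorter that hits the OOM holds only 32 KB, while roughly 1.6 GiB of task 1250's memory is not attributed to any consumer and about 9.3 GiB is already counted as execution memory, so even a 35-byte page request comes back empty.

    // Figures copied from the executor log above; the conversion helper is
    // illustrative only and not part of Spark.
    object MemoryAccountingSketch {
      val untrackedTaskBytes = 1755718878L // "used by task 1250 but are not associated with specific consumers"
      val executionBytes     = 9970419198L // "used for execution"
      val storageBytes       = 1802593L    // "used for storage"
      val sorterBytes        = 32L * 1024  // "Acquired by ...ShuffleExternalSorter...: 32.0 KB"
      val failedRequestBytes = 35L         // "Unable to acquire 35 bytes of memory, got 0"

      def gib(bytes: Long): Double = bytes.toDouble / (1L << 30)

      def main(args: Array[String]): Unit = {
        println(f"untracked task memory : ${gib(untrackedTaskBytes)}%.2f GiB") // ~1.64 GiB
        println(f"execution memory used : ${gib(executionBytes)}%.2f GiB")     // ~9.29 GiB
        println(f"storage memory used   : ${gib(storageBytes)}%.4f GiB")       // ~0.0017 GiB
        // The sorter that fails holds almost nothing; the execution pool is
        // exhausted by memory the TaskMemoryManager cannot attribute to a
        // spillable consumer, so even a 35-byte page request returns 0.
        println(s"sorter holds $sorterBytes bytes; request of $failedRequestBytes bytes fails")
      }
    }

This lines up with the "Managed memory leak detected; size = 1755718878 bytes, TID = 1250" error: the unattributed memory is exactly the amount the executor flags when the task dies.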



