Posted to issues@spark.apache.org by "huangweiyi (Jira)" <ji...@apache.org> on 2019/12/13 04:49:00 UTC
[jira] [Updated] (SPARK-30246) Spark on Yarn External Shuffle Service Memory Leak
[ https://issues.apache.org/jira/browse/SPARK-30246?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
huangweiyi updated SPARK-30246:
-------------------------------
Attachment: nm_oom.png
> Spark on Yarn External Shuffle Service Memory Leak
> --------------------------------------------------
>
> Key: SPARK-30246
> URL: https://issues.apache.org/jira/browse/SPARK-30246
> Project: Spark
> Issue Type: Bug
> Components: Shuffle, Spark Core
> Affects Versions: 2.4.3
> Environment: hadoop 2.7.3
> spark 2.4.3
> jdk 1.8.0_60
> Reporter: huangweiyi
> Priority: Major
> Attachments: nm_oom.png
>
>
> In our large, busy YARN cluster, which runs the Spark external shuffle service on each NodeManager (NM), we encountered OOMs on some NMs.
> After dumping the heap memory, I found some StreamState objects still on the heap even though the applications they belong to had already finished.
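The leak pattern described above can be sketched as follows. This is a minimal, hypothetical illustration (class and method names are assumptions, modeled loosely on the shape of Spark's shuffle stream bookkeeping, not the actual Spark source): per-fetch stream state is registered in a long-lived map, and if nothing evicts entries when the owning application terminates, they accumulate in the shuffle service's heap until the NM runs out of memory.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical sketch of the leak: a long-lived stream registry with no
// cleanup hook tied to application termination.
public class StreamLeakSketch {
    // Stand-in for the per-stream state seen lingering in the heap dump.
    static final class StreamState {
        final String appId;
        StreamState(String appId) { this.appId = appId; }
    }

    private final Map<Long, StreamState> streams = new ConcurrentHashMap<>();
    private final AtomicLong nextStreamId = new AtomicLong(0);

    // Called for every shuffle fetch: registers state keyed by a fresh stream id.
    public long registerStream(String appId) {
        long id = nextStreamId.incrementAndGet();
        streams.put(id, new StreamState(appId));
        return id;
    }

    // The missing cleanup: unless something like this runs when an
    // application finishes, its StreamState entries stay in the map forever.
    public void applicationRemoved(String appId) {
        streams.values().removeIf(s -> s.appId.equals(appId));
    }

    public int liveStreams() { return streams.size(); }

    public static void main(String[] args) {
        StreamLeakSketch mgr = new StreamLeakSketch();
        for (int i = 0; i < 1000; i++) mgr.registerStream("app_1");
        // The app has finished, but without the cleanup call its
        // 1000 StreamState entries would remain pinned on the heap.
        System.out.println("before cleanup: " + mgr.liveStreams());
        mgr.applicationRemoved("app_1");
        System.out.println("after cleanup: " + mgr.liveStreams());
    }
}
```

In the real service the registry lives as long as the NodeManager process, so even a small amount of un-evicted state per fetch compounds across many applications on a busy cluster.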
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org