Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2019/04/25 19:07:00 UTC
[jira] [Assigned] (SPARK-27434) memory leak in spark driver
[ https://issues.apache.org/jira/browse/SPARK-27434?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-27434:
------------------------------------
Assignee: Apache Spark
> memory leak in spark driver
> ---------------------------
>
> Key: SPARK-27434
> URL: https://issues.apache.org/jira/browse/SPARK-27434
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 2.4.0
> Environment: OS: Centos 7
> JVM:
> openjdk version "1.8.0_201"
> OpenJDK Runtime Environment (IcedTea 3.11.0) (Alpine 8.201.08-r0)
> OpenJDK 64-Bit Server VM (build 25.201-b08, mixed mode)
> Spark version: 2.4.0
> Reporter: Ryne Yang
> Assignee: Apache Spark
> Priority: Major
> Attachments: Screen Shot 2019-04-10 at 12.11.35 PM.png
>
>
> We got an OOM exception on the driver after it had completed multiple jobs (we are reusing the Spark context).
> We took a heap dump and ran a leak analysis on it, which shows 3.5 GB of heap retained under AsyncEventQueue. Possibly a leak.
>
> Can someone take a look?
> Here is the heap analysis:
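As a toy illustration of the kind of retention the heap dump suggests (this is not Spark's actual AsyncEventQueue code, just a hypothetical sketch): if events are posted to a queue faster than a slow listener drains them, the undrained events stay reachable on the driver heap.

```java
import java.util.concurrent.LinkedBlockingQueue;

public class QueueRetentionDemo {
    public static void main(String[] args) {
        // Unbounded queue standing in for an async event queue.
        LinkedBlockingQueue<byte[]> queue = new LinkedBlockingQueue<>();

        // Fast producer: post 1000 "events" of 1 KB each.
        for (int i = 0; i < 1000; i++) {
            queue.offer(new byte[1024]);
        }

        // Slow consumer has only drained a fraction so far;
        // everything still queued remains reachable and cannot be GC'd.
        for (int i = 0; i < 100; i++) {
            queue.poll();
        }

        System.out.println("events still retained: " + queue.size());
    }
}
```

If jobs keep posting events against a consumer that never catches up (or a listener that is never unregistered), this retained set only grows, which would match the heap growth seen here.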
> !Screen Shot 2019-04-10 at 12.11.35 PM.png!
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org