Posted to issues@spark.apache.org by "Hailong Wen (JIRA)" <ji...@apache.org> on 2015/09/06 04:18:45 UTC
[jira] [Commented] (SPARK-10132) daemon crash caused by memory leak
[ https://issues.apache.org/jira/browse/SPARK-10132?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14732163#comment-14732163 ]
Hailong Wen commented on SPARK-10132:
-------------------------------------
Latest updates:
No matter how large the JVM heap is, it is eventually exhausted.
This issue has been reported upstream and confirmed as a bug in Akka: https://github.com/akka/akka/issues/18353
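For reference, a minimal sketch of the kind of reproduction described in the report below (repeatedly submitting short batch jobs to a standalone master); the master URL, jar path, and example class here are assumptions for illustration, not taken from the original report:

```shell
#!/bin/sh
# Hypothetical reproduction loop: submit a short SparkPi batch job many
# times against a standalone master and watch the daemon heap grow.
# Adjust SPARK_HOME, the master URL, and the examples jar to your setup.
SPARK_HOME=/opt/spark
MASTER_URL=spark://master-host:7077

i=1
while [ "$i" -le 1000 ]; do
  "$SPARK_HOME/bin/spark-submit" \
    --master "$MASTER_URL" \
    --class org.apache.spark.examples.SparkPi \
    "$SPARK_HOME/lib/spark-examples.jar" 10
  i=$((i + 1))
done
```

With SPARK_DAEMON_MEMORY lowered (as in the environment above, 300m), the master/worker crash reportedly shows up much sooner.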
> daemon crash caused by memory leak
> ----------------------------------
>
> Key: SPARK-10132
> URL: https://issues.apache.org/jira/browse/SPARK-10132
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.3.1, 1.4.1
> Environment: 1. Cluster: 7 Red Hat nodes, each with 32 cores
> 2. OS type: Red Hat Enterprise Linux Server release 7.1 (Maipo)
> 3. Java version: tried both Oracle jdk 1.6 and 1.7
> java version "1.6.0_13"
> Java(TM) SE Runtime Environment (build 1.6.0_13-b03)
> Java HotSpot(TM) 64-Bit Server VM (build 11.3-b02, mixed mode)
> java version "1.7.0"
> Java(TM) SE Runtime Environment (build 1.7.0-b147)
> Java HotSpot(TM) 64-Bit Server VM (build 21.0-b17, mixed mode)
> 4. JVM Option on spark-env.sh,
> Notes: SPARK_DAEMON_MEMORY was set to 300M to speed up the crash process
> SPARK_DAEMON_JAVA_OPTS="-Xloggc:/root/spark/oracle_gclog"
> SPARK_DAEMON_MEMORY=300m
> Reporter: ZemingZhao
> Priority: Critical
> Attachments: oracle_gclog, xqjmap.all, xqjmap.live
>
>
> Constantly submitting short batch workloads to Spark will eventually crash the Spark master and worker due to a memory leak.
> According to the GC log and jmap output, the leak appears to be related to Akka, but the root cause has not been identified yet.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org