Posted to issues@spark.apache.org by "ZemingZhao (JIRA)" <ji...@apache.org> on 2015/08/20 09:30:45 UTC

[jira] [Updated] (SPARK-10132) daemon crash caused by memory leak

     [ https://issues.apache.org/jira/browse/SPARK-10132?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ZemingZhao updated SPARK-10132:
-------------------------------
    Attachment: xqjmap.live
                xqjmap.all
                oracle_gclog

Attached the GC log and jmap info.
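
For reference, histograms like the attached xqjmap.live and xqjmap.all are
typically collected with the JDK's jmap tool; a sketch, assuming the daemon's
pid (the exact commands used here are not stated in the report):

    jmap -histo:live <pid> > xqjmap.live   # live objects only (forces a full GC first)
    jmap -histo <pid>      > xqjmap.all    # all objects, including unreachable ones

Comparing the two shows how much of the heap is still strongly reachable
versus merely awaiting collection.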

> daemon crash caused by memory leak
> ----------------------------------
>
>                 Key: SPARK-10132
>                 URL: https://issues.apache.org/jira/browse/SPARK-10132
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.3.1, 1.4.1
>         Environment: 1. Cluster: 7-node Red Hat cluster, each node with 32 cores
> 2. OS type: Red Hat Enterprise Linux Server release 7.1 (Maipo)
> 3. Java version: tried both Oracle JDK 1.6 and 1.7
> java version "1.6.0_13"
> Java(TM) SE Runtime Environment (build 1.6.0_13-b03)
> Java HotSpot(TM) 64-Bit Server VM (build 11.3-b02, mixed mode)
> java version "1.7.0"
> Java(TM) SE Runtime Environment (build 1.7.0-b147)
> Java HotSpot(TM) 64-Bit Server VM (build 21.0-b17, mixed mode)
> 4. JVM options in spark-env.sh
> Note: SPARK_DAEMON_MEMORY was set to 300m to make the crash reproduce faster
> SPARK_DAEMON_JAVA_OPTS="-Xloggc:/root/spark/oracle_gclog"
> SPARK_DAEMON_MEMORY=300m
>            Reporter: ZemingZhao
>            Priority: Critical
>         Attachments: oracle_gclog, xqjmap.all, xqjmap.live
>
>
> Constantly submitting short batch workloads to Spark causes the Spark master
> and worker daemons to crash due to a memory leak.
> According to the GC log and jmap info, the leak appears to be related to
> Akka, but the root cause has not been found yet.
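>
> A minimal sketch of the kind of submit loop used (any short batch job
> reproduces it; the master URL and examples jar path below are placeholders):
>
>     while true; do
>       ./bin/spark-submit \
>         --master spark://master:7077 \
>         --class org.apache.spark.examples.SparkPi \
>         lib/spark-examples-1.4.1-hadoop2.6.0.jar 10
>     done
>
> Each iteration registers a new short-lived application with the master, so
> the daemons themselves (not the executors) accumulate per-application state,
> and their heap usage grows across submissions until they crash.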



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org