Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/06/21 06:11:57 UTC

[jira] [Commented] (SPARK-16083) Spark HistoryServer memory increases until it gets killed by the OS.

    [ https://issues.apache.org/jira/browse/SPARK-16083?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15341182#comment-15341182 ] 

Sean Owen commented on SPARK-16083:
-----------------------------------

The heap dump seems to show only a 55MB heap. I am not sure that is how memory is set for the history server, and "history-server" is not a Spark script either. What do you mean by process size here? If the OS is killing the process, it is because the JVM is being allowed to use far too much memory, and that would not be a Spark problem.
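
For comparison, here is a minimal sketch of how History Server memory is conventionally set on a stock Spark install (SPARK_DAEMON_MEMORY, SPARK_HISTORY_OPTS and sbin/start-history-server.sh are the standard, documented knobs; they are not taken from the reporter's setup):

    # conf/spark-env.sh -- the history server daemon picks up SPARK_DAEMON_MEMORY
    export SPARK_DAEMON_MEMORY=1g
    # optional extra JVM flags for the history server only, e.g. dump the heap
    # automatically if the Java heap itself is exhausted
    export SPARK_HISTORY_OPTS="-XX:+HeapDumpOnOutOfMemoryError"

    # start the daemon with the bundled script rather than invoking the class directly
    ./sbin/start-history-server.sh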

> Spark HistoryServer memory increases until it gets killed by the OS.
> --------------------------------------------------------------------
>
>                 Key: SPARK-16083
>                 URL: https://issues.apache.org/jira/browse/SPARK-16083
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.5.1
>         Environment: RHEL-6, IOP-4.1.0.0, 10 node cluster
>            Reporter: Sudhakar Thota
>         Attachments: 27814.004.000.sparkhistoryservermemory.log.1, 27814004000_spark_heap_2016-06-17_12-05.hprof.gz, spark-spark-org.apache.spark.deploy.history.HistoryServer-1-testbic1on5l.out.4
>
>
> The Spark HistoryServer process consumes memory over a few days and is finally killed by the operating system. Heap dump analysis of the jmap dumps with IBM HeapAnalyzer found that the total heap size is 800M while the process size is 11G. The HistoryServer is started with 1G ("history-server -Xms1g -Xmx1g org.apache.spark.deploy.history.HistoryServer").
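
As a hedged aside on the heap-size vs. process-size distinction above, this is roughly how the two numbers are usually obtained (the PID 12345 is a placeholder; the commands are plain JDK/Linux tooling, not anything taken from the attached logs):

    # JVM-managed heap: dump live objects for IBM HeapAnalyzer / Eclipse MAT
    jmap -dump:live,format=b,file=history-server-heap.hprof 12345

    # Whole-process footprint as the OS sees it (resident set size, in kB);
    # this includes thread stacks, metaspace/permgen and native/off-heap buffers,
    # none of which are bounded by -Xmx
    ps -o rss= -p 12345

A gap such as 800M of heap against an 11G resident set would point at memory outside the Java heap, which fits the comment above that this is a JVM/OS-level concern rather than something the -Xmx cap controls.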



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org