Posted to issues@ambari.apache.org by "Hudson (JIRA)" <ji...@apache.org> on 2016/05/26 03:01:12 UTC

[jira] [Commented] (AMBARI-16757) Spark History Server heap size is not exposed (History Server crashed with OOM)

    [ https://issues.apache.org/jira/browse/AMBARI-16757?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15301397#comment-15301397 ] 

Hudson commented on AMBARI-16757:
---------------------------------

FAILURE: Integrated in Ambari-trunk-Commit #4926 (See [https://builds.apache.org/job/Ambari-trunk-Commit/4926/])
AMBARI-16757. Spark History Server heap size is not exposed (History (sgunturi: [http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=8059a6392747292fd2b180577cd5cd93a9392b57])
* ambari-server/src/main/resources/common-services/SPARK/1.2.1/configuration/spark-env.xml


> Spark History Server heap size is not exposed (History Server crashed with OOM)
> -------------------------------------------------------------------------------
>
>                 Key: AMBARI-16757
>                 URL: https://issues.apache.org/jira/browse/AMBARI-16757
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>    Affects Versions: 2.0.0
>            Reporter: Weiqing Yang
>            Priority: Minor
>             Fix For: 2.4.0
>
>         Attachments: AMBARI-16757-1.patch, AMBARI-16757-2.patch, AMBARI-16757-3.patch
>
>
> Ambari is not exposing the heap size parameter for Spark History Server.
> The workaround is to modify spark-env and add, for example, "SPARK_DAEMON_MEMORY=2g" (see the sketch below).
> Newer versions of Spark default this to 1g, but older versions appear to have defaulted to 512m, which caused the OOM.
> The patch therefore adds "SPARK_DAEMON_MEMORY=1G" to the spark-env template (default: 1G).
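
For illustration only, a minimal sketch of the workaround described above, assuming the Ambari-managed spark-env template renders to the usual spark-env.sh; the exact value and file location depend on the cluster:

    # spark-env.sh (rendered from Ambari's spark-env template)
    # Raise the heap for Spark daemons (incl. the History Server) above the
    # old 512m default to avoid the OOM described in this issue.
    export SPARK_DAEMON_MEMORY=2g

After saving the change in Ambari, the Spark History Server needs a restart for the new heap setting to take effect.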



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)