Posted to issues@ambari.apache.org by "Weiqing Yang (JIRA)" <ji...@apache.org> on 2016/05/19 05:31:12 UTC
[jira] [Created] (AMBARI-16757) Spark History Server heap size is not exposed (History Server crashed with OOM)
Weiqing Yang created AMBARI-16757:
-------------------------------------
Summary: Spark History Server heap size is not exposed (History Server crashed with OOM)
Key: AMBARI-16757
URL: https://issues.apache.org/jira/browse/AMBARI-16757
Project: Ambari
Issue Type: Bug
Components: ambari-server
Affects Versions: 2.0.0
Reporter: Weiqing Yang
Priority: Critical
Fix For: 2.4.0
Ambari does not expose the heap size parameter for the Spark History Server.
The workaround is to modify spark-env and add, for example, "SPARK_DAEMON_MEMORY=2g".
Newer versions of Spark default this setting to 1g, but older versions appear to have defaulted to 512m, which was causing the OOM.
The patch therefore adds "SPARK_DAEMON_MEMORY=1G" to the spark-env template (default: 1G).
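As a rough sketch of the workaround, the line below could be appended to spark-env.sh (the exact path is installation-specific; on an Ambari-managed cluster the change would be made through the spark-env template in the Spark service configs rather than by editing the file directly). SPARK_DAEMON_MEMORY sets the JVM heap used by Spark daemons such as the History Server; 2g is an illustrative value.

```shell
# Hypothetical spark-env.sh excerpt (path varies by install,
# e.g. /etc/spark/conf/spark-env.sh on some distributions).
# SPARK_DAEMON_MEMORY sizes the heap of Spark daemon processes,
# including the History Server; restart the daemon to apply.
export SPARK_DAEMON_MEMORY=2g
echo "Spark daemon heap set to ${SPARK_DAEMON_MEMORY}"
```

After changing the value through Ambari, the Spark History Server must be restarted for the new heap size to take effect.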
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)