Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/12/10 04:09:11 UTC
[jira] [Commented] (SPARK-10839) SPARK_DAEMON_MEMORY has effect on heap size of thriftserver
[ https://issues.apache.org/jira/browse/SPARK-10839?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15049940#comment-15049940 ]
Apache Spark commented on SPARK-10839:
--------------------------------------
User 'ghost' has created a pull request for this issue:
https://github.com/apache/spark/pull/8921
> SPARK_DAEMON_MEMORY has effect on heap size of thriftserver
> -----------------------------------------------------------
>
> Key: SPARK-10839
> URL: https://issues.apache.org/jira/browse/SPARK-10839
> Project: Spark
> Issue Type: Bug
> Components: Spark Submit
> Affects Versions: 1.4.1, 1.5.0
> Reporter: Yun Zhao
>
> When SPARK_DAEMON_MEMORY is set in spark-env.sh to adjust the memory of the Master or Worker, it also changes the heap size of the thrift server; moreover, this cannot be overridden by spark.driver.memory or --driver-memory. Version 1.3.1 does not have this problem.
> In org.apache.spark.launcher.SparkSubmitCommandBuilder:
> {quote}
> String tsMemory =
>     isThriftServer(mainClass) ? System.getenv("SPARK_DAEMON_MEMORY") : null;
> String memory = firstNonEmpty(tsMemory,
>     firstNonEmptyValue(SparkLauncher.DRIVER_MEMORY, conf, props),
>     System.getenv("SPARK_DRIVER_MEMORY"), System.getenv("SPARK_MEM"), DEFAULT_MEM);
> cmd.add("-Xms" + memory);
> cmd.add("-Xmx" + memory);
> {quote}
> SPARK_DAEMON_MEMORY therefore has the highest priority. It could be changed like this:
> {quote}
> String memory = firstNonEmpty(firstNonEmptyValue(SparkLauncher.DRIVER_MEMORY, conf, props),
>     System.getenv("SPARK_DRIVER_MEMORY"), tsMemory, System.getenv("SPARK_MEM"), DEFAULT_MEM);
> {quote}
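> The effect of the reordering can be sketched with a minimal, standalone example (not the actual Spark launcher code; the helper and the sample values are assumptions mirroring the snippets above):
> {quote}
> // Sketch only: firstNonEmpty here mimics the launcher's helper of the
> // same name; the memory values are illustrative placeholders.
> public class MemoryPriorityDemo {
>     // Returns the first argument that is non-null and non-empty.
>     static String firstNonEmpty(String... values) {
>         for (String v : values) {
>             if (v != null && !v.isEmpty()) return v;
>         }
>         return null;
>     }
>
>     public static void main(String[] args) {
>         String tsMemory = "1g";     // stands in for SPARK_DAEMON_MEMORY
>         String driverMem = "4g";    // stands in for spark.driver.memory / --driver-memory
>         String DEFAULT_MEM = "512m";
>
>         // Current ordering: the daemon memory wins even when driver memory is set.
>         System.out.println(firstNonEmpty(tsMemory, driverMem, DEFAULT_MEM)); // 1g
>         // Proposed ordering: driver memory wins; tsMemory only acts as a fallback.
>         System.out.println(firstNonEmpty(driverMem, tsMemory, DEFAULT_MEM)); // 4g
>     }
> }
> {quote}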
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)