Posted to user@spark.apache.org by dgoldenberg <dg...@gmail.com> on 2015/07/14 01:07:08 UTC

How to set the heap size on consumers?

Hi,

I'm seeing quite a bit of information on Spark memory management. I'm just
trying to set the JVM heap size, e.g. -Xms to 512m and -Xmx to 1g or some such.

Per
http://apache-spark-user-list.1001560.n3.nabble.com/Use-of-SPARK-DAEMON-JAVA-OPTS-tt10479.html#a10529:

"SPARK_DAEMON_JAVA_OPTS is not intended for setting memory. Please use
SPARK_DAEMON_MEMORY instead. It turns out that java respects only the last
-Xms and -Xmx values, and in spark-class we put SPARK_DAEMON_JAVA_OPTS
before the SPARK_DAEMON_MEMORY. In general, memory configuration in spark
should not be done through any config or environment variable that
references "java opts"."

Is this still applicable for Spark 1.3/1.4?
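For what it's worth, in 1.3/1.4 the documented route is still to use Spark's dedicated memory settings rather than raw -Xms/-Xmx in any "java opts" variable. A rough sketch of the options (app/class/jar names below are placeholders, not from any real deployment):

```shell
# Heap sizes for the driver and executors, via spark-submit flags:
spark-submit \
  --driver-memory 512m \
  --executor-memory 1g \
  --class com.example.MyApp myapp.jar   # placeholder class/jar

# Or equivalently in conf/spark-defaults.conf:
#   spark.driver.memory    512m
#   spark.executor.memory  1g

# For the standalone master/worker daemons themselves,
# set this in conf/spark-env.sh:
#   export SPARK_DAEMON_MEMORY=1g
```

As I understand it, Spark translates these into the -Xmx it passes when launching each JVM, which sidesteps the ordering problem with SPARK_DAEMON_JAVA_OPTS described above.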

Are there any plans to tackle
https://issues.apache.org/jira/browse/SPARK-1264 ?

Thanks.



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-set-the-heap-size-on-consumers-tp23810.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org