Posted to user@spark.apache.org by MEETHU MATHEW <me...@yahoo.co.in> on 2014/07/23 10:04:10 UTC
Use of SPARK_DAEMON_JAVA_OPTS
Hi all,
Sorry for raising this topic again; I am still confused about it.
I set SPARK_DAEMON_JAVA_OPTS="-XX:+UseCompressedOops -Xmx8g"
When I run my application, I see the following line in the logs:
Spark Command: java -cp ::/usr/local/spark-1.0.1/conf:/usr/local/spark-1.0.1/assembly/target/scala-2.10/spark-assembly-1.0.1-hadoop1.2.1.jar -XX:MaxPermSize=128m -XX:+UseCompressedOops -Xmx8g -Dspark.akka.logLifecycleEvents=true -Xms512m
-Xmx512m org.apache.spark.deploy.worker.Worker spark://master:7077
-Xmx is set twice: once from SPARK_DAEMON_JAVA_OPTS, and a second time from bin/spark-class (from SPARK_DAEMON_MEMORY or DEFAULT_MEM).
I believe the second value is the one used at execution, i.e. the one passed as SPARK_DAEMON_MEMORY or DEFAULT_MEM.
So I would like to know the purpose of SPARK_DAEMON_JAVA_OPTS and how it differs from SPARK_DAEMON_MEMORY.
Thanks & Regards,
Meethu M
Re: Use of SPARK_DAEMON_JAVA_OPTS
Posted by Andrew Or <an...@databricks.com>.
Hi Meethu,
SPARK_DAEMON_JAVA_OPTS is not intended for setting memory. Please use
SPARK_DAEMON_MEMORY instead. It turns out that Java respects only the last
-Xms and -Xmx values it is given, and in spark-class we put SPARK_DAEMON_JAVA_OPTS
before the SPARK_DAEMON_MEMORY setting. In general, memory configuration in Spark
should not be done through any config or environment variable that
references "java opts".
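A minimal sketch of this last-flag-wins behavior (the command line below is illustrative, adapted from the log in the original message; the loop only simulates how the JVM ends up with the last -Xmx):

```shell
# The JVM applies only the LAST -Xms/-Xmx on its command line; earlier ones are ignored.
# Scan an illustrative daemon command line and keep the last -Xmx value seen:
cmd="java -XX:+UseCompressedOops -Xmx8g -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m org.apache.spark.deploy.worker.Worker"
effective=""
for tok in $cmd; do
  case "$tok" in
    -Xmx*) effective="${tok#-Xmx}" ;;   # later occurrences overwrite earlier ones
  esac
done
echo "effective -Xmx: $effective"
```

In conf/spark-env.sh this means setting the heap via `export SPARK_DAEMON_MEMORY=8g` and keeping SPARK_DAEMON_JAVA_OPTS limited to non-memory flags such as -XX:+UseCompressedOops.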
Andrew
2014-07-23 1:04 GMT-07:00 MEETHU MATHEW <me...@yahoo.co.in>:
>
> Hi all,
>
> Sorry for raising this topic again; I am still confused about it.
>
> I set SPARK_DAEMON_JAVA_OPTS="-XX:+UseCompressedOops -Xmx8g"
>
> When I run my application, I see the following line in the logs:
>
> Spark Command: java -cp
> ::/usr/local/spark-1.0.1/conf:/usr/local/spark-1.0.1/assembly/target/scala-2.10/spark-assembly-1.0.1-hadoop1.2.1.jar
> -XX:MaxPermSize=128m -XX:+UseCompressedOops -Xmx8g -Dspark.akka.logLifecycleEvents=true
> -Xms512m
> -Xmx512m org.apache.spark.deploy.worker.Worker spark://master:7077
>
>
> -Xmx is set twice: once from SPARK_DAEMON_JAVA_OPTS, and a second time
> from bin/spark-class (from SPARK_DAEMON_MEMORY or DEFAULT_MEM).
>
> I believe the second value is the one used at execution, i.e. the one
> passed as SPARK_DAEMON_MEMORY or DEFAULT_MEM.
>
> So I would like to know the purpose of SPARK_DAEMON_JAVA_OPTS and
> how it differs from SPARK_DAEMON_MEMORY.
>
> Thanks & Regards,
> Meethu M
>