Posted to user@spark.apache.org by akhandeshi <am...@gmail.com> on 2014/11/06 19:50:09 UTC

SparkSubmitDriverBootstrapper and JVM parameters

/usr/lib/jvm/java-1.7.0-openjdk-amd64/bin/java
org.apache.spark.deploy.SparkSubmitDriverBootstrapper

When I execute /usr/local/spark-1.1.0/bin/spark-submit local[32] for my
app, I see two processes get spun off: one is
org.apache.spark.deploy.SparkSubmitDriverBootstrapper and the other is
org.apache.spark.deploy.SparkSubmit. My understanding is that the first
one is the driver and the latter is the executor; can you confirm?

If that is true, my Spark application defaults don't seem to be picked up
from the following parameters. My SparkSubmit process picks up its JVM
parameters from here:

spark-defaults.conf
spark.daemon.memory=45g
spark.driver.memory=45g
spark.executor.memory=45g
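
For what it's worth, my understanding is that the same settings can also
be forced on the spark-submit command line, and that those flags should
take precedence over spark-defaults.conf (the main class and jar path
below are just placeholders for my app):

/usr/local/spark-1.1.0/bin/spark-submit \
  --master local[32] \
  --driver-memory 45g \
  --conf spark.executor.memory=45g \
  --class com.example.MyApp \
  /path/to/my-app.jar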

It is not clear to me when Spark uses spark-defaults.conf and when it
uses spark-env.sh. Can someone help me understand?

spark-env.sh
SPARK_DAEMON_MEMORY=30g
SPARK_EXECUTOR_MEMORY=30g
SPARK_DRIVER_MEMORY=30g
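
One way I can see which -Xmx each of the two JVMs actually ends up with
is jps from the JDK (the grep pattern below is just an example):

jps -lvm | grep -i spark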

I am running into GC/OOM issues, and I am wondering whether tweaking the
SparkSubmitDriverBootstrapper or SparkSubmit JVM parameters will help. I
did look at the configuration page on Spark's site and tried many of the
approaches suggested there.
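
In case it is relevant, the properties I found in the docs for adding
extra JVM/GC flags are spark.driver.extraJavaOptions and
spark.executor.extraJavaOptions; as I read it, adding something like the
following to the spark-submit invocation should turn on GC logging (the
flags are just the standard HotSpot ones):

--conf "spark.driver.extraJavaOptions=-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps" \
--conf "spark.executor.extraJavaOptions=-verbose:gc -XX:+PrintGCDetails"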

Thanks,
Ami



