Posted to dev@spark.apache.org by vonnagy <iv...@vadio.com> on 2016/10/06 16:21:09 UTC
Submit job with driver options in Mesos Cluster mode
I am trying to submit a job to spark running in a Mesos cluster. We need to
pass custom java options to the driver and executor for configuration, but
the driver task never includes the options. Here is an example submission:
GC_OPTS="-XX:+UseConcMarkSweepGC
-verbose:gc -XX:+PrintGCTimeStamps -Xloggc:$appdir/gc.out
-XX:MaxPermSize=512m
-XX:+CMSClassUnloadingEnabled "
EXEC_PARAMS="-Dloglevel=DEBUG -Dkafka.broker-address=${KAFKA_ADDRESS}
-Dredis.master=${REDIS_MASTER} -Dredis.port=${REDIS_PORT}"
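For what it's worth, the expansion itself looks fine when checked in isolation (the endpoint values below are placeholders standing in for our real Kafka/Redis addresses):

```shell
# Sanity-check the option string before submitting; placeholder
# endpoints, not the real cluster's addresses.
KAFKA_ADDRESS="localhost:9092"
REDIS_MASTER="localhost"
REDIS_PORT="6379"

EXEC_PARAMS="-Dloglevel=DEBUG -Dkafka.broker-address=${KAFKA_ADDRESS} -Dredis.master=${REDIS_MASTER} -Dredis.port=${REDIS_PORT}"

# Print what would actually be handed to --driver-java-options
echo "$EXEC_PARAMS"
```

So the options are assembled correctly on the submitting machine; they just never make it into the Mesos driver task.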
spark-submit \
--name client-events-intake \
--class ClientEventsApp \
--deploy-mode cluster \
--driver-java-options "${EXEC_PARAMS} ${GC_OPTS}" \
--conf "spark.ui.killEnabled=true" \
--conf "spark.mesos.coarse=true" \
--conf "spark.driver.extraJavaOptions=${EXEC_PARAMS}" \
--conf "spark.executor.extraJavaOptions=${EXEC_PARAMS}" \
--master mesos://someip:7077 \
--verbose \
some.jar
When the driver task runs in Mesos, it generates the following command:
sh -c 'cd spark-1*; bin/spark-submit --name client-events-intake --class
ClientEventsApp --master mesos://someip:5050 --driver-cores 1.0
--driver-memory 512M ../some.jar '
There are no driver options here, so the driver application fails because it
cannot find the Java options it needs. The environment variables do, however,
contain the executor options:
SPARK_EXECUTOR_OPTS -> -Dspark.executor.extraJavaOptions=-Dloglevel=DEBUG
...
Any help would be great. I know we can set some "spark.*" settings in the
default configs, but these options are not necessarily Spark-related. The same
logic works fine outside of a Mesos cluster, in Spark standalone mode.
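For comparison, the default-configs route I mentioned would look roughly like this (illustrative values; whether the Mesos dispatcher actually forwards these properties to the driver task is exactly what's in question here):

```
# conf/spark-defaults.conf on the submitting machine (illustrative)
spark.driver.extraJavaOptions    -Dloglevel=DEBUG -Dkafka.broker-address=localhost:9092
spark.executor.extraJavaOptions  -Dloglevel=DEBUG -Dkafka.broker-address=localhost:9092
```

But since some of these -D properties aren't Spark settings at all, burying them in spark-defaults.conf feels wrong even if it worked.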
Thanks!
--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/Submit-job-with-driver-options-in-Mesos-Cluster-mode-tp19265.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org