Posted to dev@spark.apache.org by Markus Losoi <ma...@gmail.com> on 2013/10/26 22:19:29 UTC

A quick note about SPARK_WORKER_OPTS and ExecutorRunner

Hi

I was wondering why the SPARK_WORKER_OPTS that I set in conf/spark-env.sh was
not passed on to the executors, and I noticed the following line in
ExecutorRunner.scala (Spark 0.8.0):

116: val workerLocalOpts =
  Option(getenv("SPARK_JAVA_OPTS")).map(Utils.splitCommandString).getOrElse(Nil)

Is SPARK_JAVA_OPTS supposed to be SPARK_WORKER_OPTS on this line? The next
line already picks up the options from the application's SPARK_JAVA_OPTS:

117: val userOpts =
  getAppEnv("SPARK_JAVA_OPTS").map(Utils.splitCommandString).getOrElse(Nil)
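For reference, here is what line 116 would presumably look like if
SPARK_WORKER_OPTS were indeed the intended variable (just my guess, with the
surrounding ExecutorRunner code kept as-is):

val workerLocalOpts =
  Option(getenv("SPARK_WORKER_OPTS")).map(Utils.splitCommandString).getOrElse(Nil)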

The options from both workerLocalOpts and userOpts are then aggregated into
the executor's command-line options in the following line:

126: Seq("-cp", classPath) ++ libraryOpts ++ workerLocalOpts ++ userOpts ++
  memoryOpts
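
To make the flow of these options concrete, below is a small, self-contained
Scala sketch (not actual Spark code) of how the three option lists end up in
the executor command line; the whitespace split is a simplified stand-in for
Utils.splitCommandString (it ignores quoting), and the classpath, library and
memory values are placeholders of my own:

object ExecutorOptsSketch {
  // Simplified stand-in for Utils.splitCommandString (ignores quoting).
  private def splitOpts(s: String): Seq[String] =
    s.trim.split("\\s+").toSeq.filter(_.nonEmpty)

  def main(args: Array[String]): Unit = {
    // Worker-local environment, read the way getenv() is used on line 116.
    val workerLocalOpts =
      sys.env.get("SPARK_JAVA_OPTS").map(splitOpts).getOrElse(Nil)
    // Application-provided environment (getAppEnv on line 117); empty here.
    val userOpts    = Seq.empty[String]
    val libraryOpts = Seq.empty[String]            // placeholder
    val memoryOpts  = Seq("-Xms512M", "-Xmx512M")  // placeholder
    val classPath   = "/example/classes"           // placeholder

    // Mirrors the concatenation on line 126.
    val command = Seq("-cp", classPath) ++ libraryOpts ++ workerLocalOpts ++
      userOpts ++ memoryOpts
    println(command.mkString(" "))
  }
}

Running this with, e.g., SPARK_JAVA_OPTS="-Dfoo=bar" in the environment prints
the -Dfoo=bar option between the classpath and the memory settings, which
matches where workerLocalOpts appears in the quoted line 126.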

Best regards,
Markus Losoi (markus.losoi@gmail.com)