Posted to issues@spark.apache.org by "Kanwaljit Singh (JIRA)" <ji...@apache.org> on 2015/01/07 11:34:34 UTC

[jira] [Closed] (SPARK-2641) Spark submit doesn't pick up executor instances from properties file

     [ https://issues.apache.org/jira/browse/SPARK-2641?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kanwaljit Singh closed SPARK-2641.
----------------------------------

> Spark submit doesn't pick up executor instances from properties file
> --------------------------------------------------------------------
>
>                 Key: SPARK-2641
>                 URL: https://issues.apache.org/jira/browse/SPARK-2641
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.0.0
>            Reporter: Kanwaljit Singh
>
> When running spark-submit in YARN cluster mode, we provide a properties file using the --properties-file option, e.g.:
> spark.executor.instances=5
> spark.executor.memory=2120m
> spark.executor.cores=3
> The submitted job picks up the memory and cores settings, but not the number of executor instances.
> I think the issue is in org.apache.spark.deploy.SparkSubmitArguments:
>     // Use properties file as fallback for values which have a direct analog to
>     // arguments in this script.
>     master = Option(master).getOrElse(defaultProperties.get("spark.master").orNull)
>     executorMemory = Option(executorMemory)
>       .getOrElse(defaultProperties.get("spark.executor.memory").orNull)
>     executorCores = Option(executorCores)
>       .getOrElse(defaultProperties.get("spark.executor.cores").orNull)
>     totalExecutorCores = Option(totalExecutorCores)
>       .getOrElse(defaultProperties.get("spark.cores.max").orNull)
>     name = Option(name).getOrElse(defaultProperties.get("spark.app.name").orNull)
>     jars = Option(jars).getOrElse(defaultProperties.get("spark.jars").orNull)
> Along with these defaults, we should also set a default for the number of instances:
>     numExecutors = Option(numExecutors)
>       .getOrElse(defaultProperties.get("spark.executor.instances").orNull)
> PS: spark.executor.instances is also not documented at http://spark.apache.org/docs/latest/configuration.html
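
The fallback pattern quoted above can be sketched in isolation. This is a minimal illustration, not Spark code; FallbackSketch and withDefault are hypothetical names. It shows why a field that is never passed through the Option/getOrElse chain (as numExecutors was not) silently ignores the properties file:

```scala
// Hypothetical sketch of the fallback pattern in SparkSubmitArguments:
// a value set on the command line wins; otherwise fall back to the
// properties file; otherwise null.
object FallbackSketch {
  def withDefault(cliValue: String, props: Map[String, String], key: String): String =
    Option(cliValue).getOrElse(props.get(key).orNull)

  def main(args: Array[String]): Unit = {
    val props = Map("spark.executor.instances" -> "5")
    // No command-line value: the properties file is used.
    println(withDefault(null, props, "spark.executor.instances")) // 5
    // Command-line value present: it overrides the properties file.
    println(withDefault("10", props, "spark.executor.instances")) // 10
  }
}
```

A field that skips this chain keeps whatever the argument parser left in it (null if the flag was absent), which matches the reported behavior for spark.executor.instances.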



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
