Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/06/12 11:59:20 UTC

[jira] [Resolved] (SPARK-15781) Misleading deprecated property in standalone cluster configuration documentation

     [ https://issues.apache.org/jira/browse/SPARK-15781?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-15781.
-------------------------------
       Resolution: Fixed
    Fix Version/s: 2.0.0

Issue resolved by pull request 13533
[https://github.com/apache/spark/pull/13533]

> Misleading deprecated property in standalone cluster configuration documentation
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-15781
>                 URL: https://issues.apache.org/jira/browse/SPARK-15781
>             Project: Spark
>          Issue Type: Documentation
>          Components: Documentation
>    Affects Versions: 1.6.1
>            Reporter: Jonathan Taws
>            Priority: Minor
>             Fix For: 2.0.0
>
>
> I am unsure whether this is regarded as an issue, but in the [latest|http://spark.apache.org/docs/latest/spark-standalone.html#cluster-launch-scripts] documentation for configuring Spark in standalone cluster mode, the following property is documented:
> |SPARK_WORKER_INSTANCES|Number of worker instances to run on each machine (default: 1). You can make this more than 1 if you have very large machines and would like multiple Spark worker processes. If you do set this, make sure to also set SPARK_WORKER_CORES explicitly to limit the cores per worker, or else each worker will try to use all the cores.|
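> As a minimal sketch, the documented settings would look roughly like this in {{conf/spark-env.sh}} (the values below are illustrative; the 4 matches the warning further down):
> {code}
> # conf/spark-env.sh -- standalone-mode settings as described by the documentation above
> SPARK_WORKER_INSTANCES=4   # run four worker processes on this machine
> SPARK_WORKER_CORES=2       # cap the cores per worker so the workers don't all claim every core
> {code}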
> However, once I launch Spark with the spark-submit utility and the property {{SPARK_WORKER_INSTANCES}} set in my spark-env.sh file, I get the following deprecation warning:
> {code}
> 16/06/06 16:38:28 WARN SparkConf: 
> SPARK_WORKER_INSTANCES was detected (set to '4').
> This is deprecated in Spark 1.0+.
> Please instead use:
>  - ./spark-submit with --num-executors to specify the number of executors
>  - Or set SPARK_EXECUTOR_INSTANCES
>  - spark.executor.instances to configure the number of instances in the spark config.
> {code}
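> For reference, the alternatives named in the warning would look roughly like the sketch below; the values and the application jar are placeholders, not something taken from my actual setup:
> {code}
> # 1) pass the executor count on the spark-submit command line
> ./bin/spark-submit --num-executors 4 <other options> <application jar>
>
> # 2) or export the environment variable in conf/spark-env.sh
> SPARK_EXECUTOR_INSTANCES=4
>
> # 3) or set the property in conf/spark-defaults.conf
> spark.executor.instances   4
> {code}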
> Is it regarded as normal practice to keep deprecated properties in the documentation?
> I would have preferred to learn about the --num-executors option directly from the documentation, rather than having to submit my application and run into a deprecation warning.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
