Posted to commits@spark.apache.org by do...@apache.org on 2019/03/25 04:56:05 UTC

[spark] branch master updated: [SPARK-27261][DOC] Improve app submission doc for passing multiple configs

This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 8ec6cb6  [SPARK-27261][DOC] Improve app submission doc for passing multiple configs
8ec6cb6 is described below

commit 8ec6cb67c71c67788230cab8a0cd34d8ad3ce24b
Author: s71955 <su...@gmail.com>
AuthorDate: Sun Mar 24 21:55:48 2019 -0700

    [SPARK-27261][DOC] Improve app submission doc for passing multiple configs
    
    ## What changes were proposed in this pull request?
    
    When submitting a Spark application, how to pass multiple configurations is not clearly documented, and no example is given. It would be better to document this, since the Spark documentation currently lacks clarity here.
    While browsing I could also see a few questions raised by users; a reference is provided below.
    
    https://community.hortonworks.com/questions/105022/spark-submit-multiple-configurations.html
    
     As part of the fix, I documented the above scenario with an example.
    
    ## How was this patch tested?
    Manual inspection of the updated document.
    
    Closes #24191 from sujith71955/master_conf.
    
    Authored-by: s71955 <su...@gmail.com>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 docs/submitting-applications.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/submitting-applications.md b/docs/submitting-applications.md
index 77aa083..d6b663e 100644
--- a/docs/submitting-applications.md
+++ b/docs/submitting-applications.md
@@ -44,7 +44,7 @@ Some of the commonly used options are:
 * `--class`: The entry point for your application (e.g. `org.apache.spark.examples.SparkPi`)
 * `--master`: The [master URL](#master-urls) for the cluster (e.g. `spark://23.195.26.187:7077`)
 * `--deploy-mode`: Whether to deploy your driver on the worker nodes (`cluster`) or locally as an external client (`client`) (default: `client`) <b> &#8224; </b>
-* `--conf`: Arbitrary Spark configuration property in key=value format. For values that contain spaces wrap "key=value" in quotes (as shown).
+* `--conf`: Arbitrary Spark configuration property in key=value format. For values that contain spaces wrap "key=value" in quotes (as shown). Multiple configurations should be passed as separate arguments. (e.g. `--conf <key>=<value> --conf <key2>=<value2>`)
 * `application-jar`: Path to a bundled jar including your application and all dependencies. The URL must be globally visible inside of your cluster, for instance, an `hdfs://` path or a `file://` path that is present on all nodes.
 * `application-arguments`: Arguments passed to the main method of your main class, if any
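The added doc line can be illustrated with a short shell sketch. The `SparkPi` class and example jar are the standard Spark examples; the specific config keys are illustrative, the `count_args` helper is hypothetical, and the `spark-submit` invocation is shown as a comment since Spark may not be installed where this runs:

```shell
#!/bin/sh
# Sketch of passing multiple configs, one --conf flag per property
# (values are illustrative, not taken from the commit):
#
#   spark-submit \
#     --class org.apache.spark.examples.SparkPi \
#     --master local[2] \
#     --conf spark.eventLog.enabled=false \
#     --conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps" \
#     examples/jars/spark-examples.jar 100

# Quoting matters because the shell splits unquoted spaces into separate
# arguments. This hypothetical helper just counts the arguments it receives.
count_args() { echo "$#"; }

count_args --conf spark.eventLog.enabled=false \
           --conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps"
# prints 4: two --conf flags, each followed by a single (quoted) key=value
```

Without the quotes around the second value, the shell would split it into extra arguments and `spark-submit` would misparse the command line.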
 


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org