Posted to commits@spark.apache.org by sr...@apache.org on 2016/10/22 08:39:15 UTC

spark git commit: [SPARK-17898][DOCS] `--repositories` needs username and password

Repository: spark
Updated Branches:
  refs/heads/master 625fdddac -> 01b26a064


[SPARK-17898][DOCS] `--repositories` needs username and password

## What changes were proposed in this pull request?

Document `user:password` syntax as a possible means of specifying credentials for password-protected `--repositories`

## How was this patch tested?

Doc build

Author: Sean Owen <so...@cloudera.com>

Closes #15584 from srowen/SPARK-17898.
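
For reference, a minimal sketch of the documented syntax; the repository host and
package coordinate below are hypothetical placeholders, not taken from the patch:

    # resolve a package from a password-protected resolver by embedding
    # credentials directly in the --repositories URI
    bin/spark-shell \
      --packages com.example:mylib:1.0 \
      --repositories https://user:password@repo.example.com/maven

As the added note cautions, credentials embedded this way can end up in shell
history and process listings, so they should be handled with care.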


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/01b26a06
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/01b26a06
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/01b26a06

Branch: refs/heads/master
Commit: 01b26a06436b4c8020f22be3e1da4995b44c9b03
Parents: 625fddd
Author: Sean Owen <so...@cloudera.com>
Authored: Sat Oct 22 09:39:07 2016 +0100
Committer: Sean Owen <so...@cloudera.com>
Committed: Sat Oct 22 09:39:07 2016 +0100

----------------------------------------------------------------------
 docs/programming-guide.md       | 8 ++++----
 docs/submitting-applications.md | 2 ++
 2 files changed, 6 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/01b26a06/docs/programming-guide.md
----------------------------------------------------------------------
diff --git a/docs/programming-guide.md b/docs/programming-guide.md
index 20b4bee..7516579 100644
--- a/docs/programming-guide.md
+++ b/docs/programming-guide.md
@@ -182,7 +182,7 @@ variable called `sc`. Making your own SparkContext will not work. You can set wh
 context connects to using the `--master` argument, and you can add JARs to the classpath
 by passing a comma-separated list to the `--jars` argument. You can also add dependencies
 (e.g. Spark Packages) to your shell session by supplying a comma-separated list of maven coordinates
-to the `--packages` argument. Any additional repositories where dependencies might exist (e.g. SonaType)
+to the `--packages` argument. Any additional repositories where dependencies might exist (e.g. Sonatype)
 can be passed to the `--repositories` argument. For example, to run `bin/spark-shell` on exactly
 four cores, use:
 
@@ -214,9 +214,9 @@ variable called `sc`. Making your own SparkContext will not work. You can set wh
 context connects to using the `--master` argument, and you can add Python .zip, .egg or .py files
 to the runtime path by passing a comma-separated list to `--py-files`. You can also add dependencies
 (e.g. Spark Packages) to your shell session by supplying a comma-separated list of maven coordinates
-to the `--packages` argument. Any additional repositories where dependencies might exist (e.g. SonaType)
-can be passed to the `--repositories` argument. Any python dependencies a Spark Package has (listed in
-the requirements.txt of that package) must be manually installed using pip when necessary.
+to the `--packages` argument. Any additional repositories where dependencies might exist (e.g. Sonatype)
+can be passed to the `--repositories` argument. Any Python dependencies a Spark package has (listed in
+the requirements.txt of that package) must be manually installed using `pip` when necessary.
 For example, to run `bin/pyspark` on exactly four cores, use:
 
 {% highlight bash %}

http://git-wip-us.apache.org/repos/asf/spark/blob/01b26a06/docs/submitting-applications.md
----------------------------------------------------------------------
diff --git a/docs/submitting-applications.md b/docs/submitting-applications.md
index 6fe3049..b738194 100644
--- a/docs/submitting-applications.md
+++ b/docs/submitting-applications.md
@@ -190,6 +190,8 @@ is handled automatically, and with Spark standalone, automatic cleanup can be co
 Users may also include any other dependencies by supplying a comma-delimited list of maven coordinates
 with `--packages`. All transitive dependencies will be handled when using this command. Additional
 repositories (or resolvers in SBT) can be added in a comma-delimited fashion with the flag `--repositories`.
+(Note that credentials for password-protected repositories can be supplied in some cases in the repository URI,
+such as in `https://user:password@host/...`. Be careful when supplying credentials this way.)
 These commands can be used with `pyspark`, `spark-shell`, and `spark-submit` to include Spark Packages.
 
 For Python, the equivalent `--py-files` option can be used to distribute `.egg`, `.zip` and `.py` libraries
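
Since the revised text says these options work with `pyspark`, `spark-shell`, and
`spark-submit` alike, the same URI-embedded credential form would apply to a batch
submission as well. A hedged example, with hypothetical coordinates, host, main
class, and application jar:

    # fetch transitive dependencies from a password-protected repository,
    # then submit the application
    bin/spark-submit \
      --class com.example.Main \
      --packages com.example:mylib:1.0 \
      --repositories https://user:password@repo.example.com/maven \
      app.jar

Per the revised programming-guide text, any Python dependencies listed in a Spark
package's requirements.txt would still need to be installed manually with `pip`.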

