Posted to commits@spark.apache.org by sr...@apache.org on 2018/01/12 14:29:50 UTC

spark git commit: Update rdd-programming-guide.md

Repository: spark
Updated Branches:
  refs/heads/master 505086806 -> f5300fbbe


Update rdd-programming-guide.md

## What changes were proposed in this pull request?

Small typo correction: removed a duplicated word ("including including").

## How was this patch tested?

Please review http://spark.apache.org/contributing.html before opening a pull request.

Author: Matthias Beaupère <ma...@gmail.com>

Closes #20212 from matthiasbe/patch-1.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/f5300fbb
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/f5300fbb
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/f5300fbb

Branch: refs/heads/master
Commit: f5300fbbe370af3741560f67bfb5ae6f0b0f7bb5
Parents: 5050868
Author: Matthias Beaupère <ma...@gmail.com>
Authored: Fri Jan 12 08:29:46 2018 -0600
Committer: Sean Owen <so...@cloudera.com>
Committed: Fri Jan 12 08:29:46 2018 -0600

----------------------------------------------------------------------
 docs/rdd-programming-guide.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/f5300fbb/docs/rdd-programming-guide.md
----------------------------------------------------------------------
diff --git a/docs/rdd-programming-guide.md b/docs/rdd-programming-guide.md
index 29af159..2e29aef 100644
--- a/docs/rdd-programming-guide.md
+++ b/docs/rdd-programming-guide.md
@@ -91,7 +91,7 @@ so C libraries like NumPy can be used. It also works with PyPy 2.3+.
 
 Python 2.6 support was removed in Spark 2.2.0.
 
-Spark applications in Python can either be run with the `bin/spark-submit` script which includes Spark at runtime, or by including including it in your setup.py as:
+Spark applications in Python can either be run with the `bin/spark-submit` script which includes Spark at runtime, or by including it in your setup.py as:
 
 {% highlight python %}
     install_requires=[

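For context, the `install_requires` snippet the corrected sentence refers to sits inside a setup.py file. A minimal sketch of such a file might look like the following; the project name, version, and pyspark version specifier are illustrative assumptions, not taken from the Spark docs:

```python
# Hypothetical minimal setup.py sketch showing where install_requires fits.
# The project name/version and the pyspark pin are illustrative only.
from setuptools import setup

setup(
    name="my-spark-app",       # hypothetical project name
    version="0.1.0",
    py_modules=["my_app"],     # hypothetical application module
    install_requires=[
        # Declares Spark as a dependency so it is installed with the app,
        # as an alternative to running via bin/spark-submit.
        "pyspark",
    ],
)
```

With this in place, `pip install .` pulls in pyspark alongside the application, which is the alternative to `bin/spark-submit` that the edited sentence describes.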

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org