Posted to commits@spark.apache.org by gu...@apache.org on 2019/03/02 05:24:17 UTC

[spark] branch master updated: [SPARK-26807][DOCS] Clarify that PySpark is on PyPI now

This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new a97a19d  [SPARK-26807][DOCS] Clarify that PySpark is on PyPI now
a97a19d is described below

commit a97a19dd9342a651be9551025029788876e9d1d3
Author: Sean Owen <se...@databricks.com>
AuthorDate: Sat Mar 2 14:23:53 2019 +0900

    [SPARK-26807][DOCS] Clarify that PySpark is on PyPI now
    
    ## What changes were proposed in this pull request?
    
    The docs still say that Spark will be available on PyPI "in the future"; this wording just needs to be updated.
    
    ## How was this patch tested?
    
    Doc build
    
    Closes #23933 from srowen/SPARK-26807.
    
    Authored-by: Sean Owen <se...@databricks.com>
    Signed-off-by: Hyukjin Kwon <gu...@apache.org>
---
 docs/index.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/index.md b/docs/index.md
index 8864239..a85dd9e 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -20,7 +20,7 @@ Please see [Spark Security](security.html) before downloading and running Spark.
 Get Spark from the [downloads page](https://spark.apache.org/downloads.html) of the project website. This documentation is for Spark version {{site.SPARK_VERSION}}. Spark uses Hadoop's client libraries for HDFS and YARN. Downloads are pre-packaged for a handful of popular Hadoop versions.
 Users can also download a "Hadoop free" binary and run Spark with any Hadoop version
 [by augmenting Spark's classpath](hadoop-provided.html).
-Scala and Java users can include Spark in their projects using its Maven coordinates and in the future Python users can also install Spark from PyPI.
+Scala and Java users can include Spark in their projects using its Maven coordinates and Python users can install Spark from PyPI.
 
 
 If you'd like to build Spark from 
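
For context, the install path the updated sentence points at is the "pyspark"
package on PyPI. A minimal sketch of what that looks like for a Python user
(the app name below is illustrative, not part of the commit):

    # shell: install PySpark from PyPI
    #   pip install pyspark

    # python: smoke-test the installed package
    from pyspark.sql import SparkSession

    # "pypi-check" is an arbitrary example app name
    spark = SparkSession.builder.appName("pypi-check").getOrCreate()
    print(spark.version)  # prints the Spark version bundled with the package
    spark.stop()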


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org