Posted to issues@spark.apache.org by "Emmanuel Arias (JIRA)" <ji...@apache.org> on 2019/02/01 02:04:00 UTC

[jira] [Created] (SPARK-26807) Confusing documentation regarding installation from PyPi

Emmanuel Arias created SPARK-26807:
--------------------------------------

             Summary: Confusing documentation regarding installation from PyPi
                 Key: SPARK-26807
                 URL: https://issues.apache.org/jira/browse/SPARK-26807
             Project: Spark
          Issue Type: Documentation
          Components: Documentation
    Affects Versions: 2.4.0
            Reporter: Emmanuel Arias


Hello!

I am new to Spark. Reading the documentation, I think the Downloading section is a little confusing.

[https://spark.apache.org/docs/latest/#downloading] says: "Scala and Java users can include Spark in their projects using its Maven coordinates and in the future Python users can also install Spark from PyPI." I interpret this to mean that Spark is not yet available on PyPI. However, [https://spark.apache.org/downloads.html] says: "[PySpark|https://pypi.python.org/pypi/pyspark] is now available in pypi. To install just run {{pip install pyspark}}."
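
For context, this is the kind of quick check a new user would try after reading the second page (a minimal sketch, assuming PySpark 2.4.0 installed from PyPI and run in local mode; the app name is arbitrary):

    $ pip install pyspark
    $ python
    >>> from pyspark.sql import SparkSession
    >>> # Start a local-mode session; "local[*]" uses all available cores.
    >>> spark = SparkSession.builder \
    ...     .master("local[*]") \
    ...     .appName("pypi-install-check") \
    ...     .getOrCreate()
    >>> # A trivial job to confirm the installation works end to end.
    >>> spark.range(5).count()
    5
    >>> spark.stop()

If the first page is right and PyPI is only planned "in the future", this should fail, but it works today, which is why the two pages read as contradictory.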



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org