Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/03/02 05:25:00 UTC

[jira] [Resolved] (SPARK-26807) Confusing documentation regarding installation from PyPi

     [ https://issues.apache.org/jira/browse/SPARK-26807?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-26807.
----------------------------------
       Resolution: Fixed
    Fix Version/s: 3.0.0

Issue resolved by pull request 23933
[https://github.com/apache/spark/pull/23933]

> Confusing documentation regarding installation from PyPi
> --------------------------------------------------------
>
>                 Key: SPARK-26807
>                 URL: https://issues.apache.org/jira/browse/SPARK-26807
>             Project: Spark
>          Issue Type: Documentation
>          Components: Documentation
>    Affects Versions: 2.4.0
>            Reporter: Emmanuel Arias
>            Assignee: Sean Owen
>            Priority: Trivial
>             Fix For: 3.0.0
>
>
> Hello!
> I am new to Spark, and while reading the documentation I found the Downloading section a little confusing.
> [https://spark.apache.org/docs/latest/#downloading] says: "Scala and Java users can include Spark in their projects using its Maven coordinates and in the future Python users can also install Spark from PyPI.", which I read as meaning that Spark is not yet on PyPI. But [https://spark.apache.org/downloads.html] says: "[PySpark|https://pypi.python.org/pypi/pyspark] is now available in pypi. To install just run {{pip install pyspark}}."
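
For context, a minimal sketch of the PyPI route the downloads page describes: the {{pip install pyspark}} command is quoted from that page, while the session check below is standard PySpark API, shown here only as an illustrative smoke test (the app name is arbitrary):

  # Install PySpark from PyPI, as the downloads page describes:
  #   pip install pyspark

  # Verify the install with a minimal PySpark session:
  from pyspark.sql import SparkSession

  spark = SparkSession.builder.appName("pypi-install-check").getOrCreate()
  print(spark.version)  # e.g. "2.4.0", whichever version pip resolved
  spark.stop()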



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org