Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/09/04 01:42:27 UTC

[GitHub] [spark] srowen commented on a change in pull request #29640: [SPARK-32180][PYTHON][DOCS] Installation page of Getting Started in PySpark documentation

srowen commented on a change in pull request #29640:
URL: https://github.com/apache/spark/pull/29640#discussion_r483338273



##########
File path: python/docs/source/getting_started/index.rst
##########
@@ -20,7 +20,10 @@
 Getting Started
 ===============
 
+This page lists an overview of the basic steps required to setup & get started with PySpark.

Review comment:
       "This page summarizes the basic steps ..."
   & -> and

##########
File path: python/docs/source/getting_started/installation.rst
##########
@@ -0,0 +1,119 @@
+..  Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+..    http://www.apache.org/licenses/LICENSE-2.0
+
+..  Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+============
+Installation
+============
+
+The official release channel is to download it from `the Apache Spark website <https://spark.apache.org/downloads.html>`_.

Review comment:
       "Official releases are available from ..."

##########
File path: python/docs/source/getting_started/installation.rst
##########
@@ -0,0 +1,119 @@
+The official release channel is to download it from `the Apache Spark website <https://spark.apache.org/downloads.html>`_.
+Alternatively, you can also install it via pip from PyPI.  PyPI installation is usually to use
+standalone locally or as a client to connect to a cluster instead of setting a cluster up.  
+ 
+This page includes the instructions for installing PySpark by using pip, Conda, downloading manually, and building it from the source.
+
+Python Version Supported
+------------------------
+
+Python 3.6 and above.
+
+Using PyPI
+----------
+
+PySpark installation using `PyPI <https://pypi.org/project/pyspark/>`_
+
+.. code-block:: bash
+
+    pip install pyspark
+	
+Using Conda  
+-----------
+
+Conda is an open-source package management and environment management system which is a part of `Anaconda <https://docs.continuum.io/anaconda/>`_ distribution. It is both cross-platform and language agnostic.
+  
+Conda can be used to create a virtual environment from terminal as shown below:
+
+.. code-block:: bash
+
+	conda create -n pyspark_env 
+
+After the virtual environment is created, it should be visible under the list of conda environments which can be seen using the following command:
+
+.. code-block:: bash
+
+	conda env list
+
+The newly created environment can be accessed using the following command:
+
+.. code-block:: bash
+
+	conda activate pyspark_env
+
+In lower Conda version, the following command might be used:
+
+.. code-block:: bash
+
+	source activate pyspark_env
+
+PySpark installation using ``pip`` under Conda environment is official. 
+
+PySpark can be installed in this newly created environment using PyPI as shown before:

Review comment:
       Why repeat this?
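
One consolidated flow, as a sketch of what dropping the repetition might look like (all commands as already shown on this page):

    conda create -n pyspark_env
    conda activate pyspark_env
    pip install pyspark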

##########
File path: python/docs/source/getting_started/installation.rst
##########
@@ -0,0 +1,119 @@
+In lower Conda version, the following command might be used:

Review comment:
       In earlier Conda versions ... should be used:
   (earlier than what?)
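
For reference, `conda activate` arrived in conda 4.4, so the cutoff is presumably "conda older than 4.4" (worth confirming against the conda release notes):

    # conda >= 4.4 (assumed cutoff)
    conda activate pyspark_env
    # conda < 4.4
    source activate pyspark_env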

##########
File path: python/docs/source/getting_started/installation.rst
##########
@@ -0,0 +1,119 @@
+PySpark can be installed in this newly created environment using PyPI as shown before:
+
+.. code-block:: bash
+
+	pip install pyspark
+
+`PySpark at Conda <https://anaconda.org/conda-forge/pyspark>`_ is not the official release.
+
+Official Release Channel
+------------------------
+
+Different flavor of PySpark is available in `the official release channel <https://spark.apache.org/downloads.html>`_.
+Any suitable version can be downloaded and extracted as below:
+
+.. code-block:: bash
+
+    tar xzvf spark-3.0.0-bin-hadoop2.7.tgz
+
+An important step is to ensure ``SPARK_HOME`` environment variable points to the directory where the code has been extracted. 

Review comment:
       Ensure the `SPARK_HOME` ...
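
A quick sanity check along those lines, assuming a hypothetical extraction path:

    export SPARK_HOME=/opt/spark-3.0.0-bin-hadoop2.7   # hypothetical location
    ls "$SPARK_HOME/python/lib"                        # expect pyspark.zip and a py4j-*-src.zip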

##########
File path: python/docs/source/getting_started/installation.rst
##########
@@ -0,0 +1,119 @@
+The next step is to properly define ``PYTHONPATH`` such that it can find the PySpark and 
+Py4J under ``$SPARK_HOME/python/lib``, one example of doing this is shown below:
+
+.. code-block:: bash
+
+    cd spark-3.0.0-bin-hadoop2.7
+    export SPARK_HOME=`pwd`
+    export PYTHONPATH=$(ZIPS=("$SPARK_HOME"/python/lib/*.zip); IFS=:; echo "${ZIPS[*]}"):$PYTHONPATH
+
+Installing from Source
+----------------------
+
+To install PySpark from source, refer `Building Spark <https://spark.apache.org/docs/latest/building-spark.html>`_.
+
+Steps for defining ``PYTHONPATH`` is same as described in `Official Release Channel <#official-release-channel>`_. 

Review comment:
       Refer to ... for steps to define `PYTHONPATH`
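
For comparison, the same setup with the zip names written out; the Py4J version here matches the dependency table later in this file but varies by release:

    export SPARK_HOME=/opt/spark-3.0.0-bin-hadoop2.7   # hypothetical location
    export PYTHONPATH="$SPARK_HOME/python/lib/pyspark.zip:$SPARK_HOME/python/lib/py4j-0.10.9-src.zip:$PYTHONPATH"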

##########
File path: python/docs/source/getting_started/installation.rst
##########
@@ -0,0 +1,119 @@
+After the virtual environment is created, it should be visible under the list of conda environments which can be seen using the following command:

Review comment:
       I would use Conda or `conda` consistently, depending on whether you mean the product or command

##########
File path: python/docs/source/getting_started/installation.rst
##########
@@ -0,0 +1,119 @@
+Alternatively, you can also install it via pip from PyPI.  PyPI installation is usually to use

Review comment:
       remove "also"
   pip -> `pip`
   "is usually for standalone..."

##########
File path: python/docs/source/getting_started/installation.rst
##########
@@ -0,0 +1,119 @@
+PySpark installation using ``pip`` under Conda environment is official. 

Review comment:
       What does this mean?

##########
File path: python/docs/source/getting_started/installation.rst
##########
@@ -0,0 +1,119 @@
+Different flavor of PySpark is available in `the official release channel <https://spark.apache.org/downloads.html>`_.

Review comment:
       flavor -> flavors
   What do flavors mean here?
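
Presumably "flavors" means the per-Hadoop builds listed on the downloads page; illustrative archive names for 3.0.0:

    # e.g. spark-3.0.0-bin-hadoop2.7.tgz, spark-3.0.0-bin-hadoop3.2.tgz,
    #      spark-3.0.0-bin-without-hadoop.tgz
    tar xzvf spark-3.0.0-bin-hadoop3.2.tgz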
   

##########
File path: python/docs/source/getting_started/installation.rst
##########
@@ -0,0 +1,119 @@
+The next step is to properly define ``PYTHONPATH`` such that it can find the PySpark and 

Review comment:
       Just: Define `PYTHONPATH` such that ...

##########
File path: python/docs/source/getting_started/installation.rst
##########
@@ -0,0 +1,119 @@
+Dependencies
+------------
+============= ========================= ====================
+Package       Minimum supported version Note
+============= ========================= ====================
+`pandas`      0.23.2                    Optional
+`NumPy`       1.7                       Optional
+`pyarrow`     0.15.1                    Optional
+`Py4J`        0.10.9                    Required
+============= ========================= ====================
+
+**Note**: A prerequisite for PySpark installation is the availability of ``JAVA 8 or 11`` and ``JAVA_HOME`` properly set.

Review comment:
       Just say Java 8 or later
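
A minimal check of that prerequisite (the quoted page targets Java 8 or 11 for Spark 3.0):

    java -version      # should report 8 or 11
    echo "$JAVA_HOME"  # should point at the matching JDK root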

##########
File path: python/docs/source/getting_started/installation.rst
##########
@@ -0,0 +1,119 @@
+To install PySpark from source, refer `Building Spark <https://spark.apache.org/docs/latest/building-spark.html>`_.

Review comment:
       refer to ...
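
As a sketch of that path, if the Building Spark instructions are followed (the flags there are authoritative; these two steps are from memory):

    ./build/mvn -DskipTests clean package    # build Spark itself
    cd python && python setup.py sdist       # produce a pip-installable tarball under python/dist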



