Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/10/19 10:31:00 UTC

[GitHub] [spark] zero323 commented on a change in pull request #34315: [SPARK-37050][PYTHON] Update Conda installation instructions

zero323 commented on a change in pull request #34315:
URL: https://github.com/apache/spark/pull/34315#discussion_r731723248



##########
File path: python/docs/source/getting_started/install.rst
##########
@@ -83,46 +83,54 @@ Note that this installation way of PySpark with/without a specific Hadoop versio
 Using Conda
 -----------
 
-Conda is an open-source package management and environment management system which is a part of
-the `Anaconda <https://docs.continuum.io/anaconda/>`_ distribution. It is both cross-platform and
-language agnostic. In practice, Conda can replace both `pip <https://pip.pypa.io/en/latest/>`_ and
-`virtualenv <https://virtualenv.pypa.io/en/latest/>`_.
-
-Create new virtual environment from your terminal as shown below:
-
-.. code-block:: bash
-
-    conda create -n pyspark_env
-
-After the virtual environment is created, it should be visible under the list of Conda environments
-which can be seen using the following command:
+Conda is an open-source package management and environment management system (developed by
+`Anaconda <https://www.anaconda.com/>`_), which is best installed through
+`Miniconda <https://docs.conda.io/en/latest/miniconda.html>`_ or `Miniforge <https://github.com/conda-forge/miniforge/>`_.
+The tool is both cross-platform and language-agnostic, and in practice, conda can replace both
+`pip <https://pip.pypa.io/en/latest/>`_ and `virtualenv <https://virtualenv.pypa.io/en/latest/>`_.
+
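+If you do not have a conda installation yet, a minimal way to bootstrap one is shown below
+(a sketch, assuming the Miniforge installer's platform-based naming convention):
+
+.. code-block:: bash
+
+    # download and run the latest Miniforge installer for the current OS/architecture
+    curl -L -O "https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-$(uname)-$(uname -m).sh"
+    bash "Miniforge3-$(uname)-$(uname -m).sh"
+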
+Conda uses so-called channels to distribute packages, and alongside the default channels
+managed by Anaconda itself, the most important channel is `conda-forge <https://conda-forge.org/>`_,
+a community-driven packaging effort that is the most extensive and most current (and in most
+cases also serves as the upstream for the Anaconda channels).
+
+Generally, it is recommended to use *as few channels as possible*. Conda-forge and Anaconda put a
+lot of effort into guaranteeing binary compatibility between packages (e.g. by using compatible
+compilers for all packages and tracking which packages are ABI-relevant). Needlessly mixing in
+other channels can break those guarantees, which is why conda-forge even recommends enabling
+so-called "strict channel priority":
 
 .. code-block:: bash
 
-    conda env list
+    conda config --add channels conda-forge
+    conda config --set channel_priority strict
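+
+You can verify the resulting configuration with the command below (a quick sanity check, not a
+required step):
+
+.. code-block:: bash
+
+    # show the configured channels and the channel priority mode
+    conda config --show channels channel_priority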
 
-Now activate the newly created environment with the following command:
+To create a new conda environment from your terminal and activate it, proceed as shown below:
 
 .. code-block:: bash
 
+    conda create -n pyspark_env
     conda activate pyspark_env
 
-You can install pyspark by `Using PyPI <#using-pypi>`_ to install PySpark in the newly created
-environment, for example as below. It will install PySpark under the new virtual environment
-``pyspark_env`` created above.
+After activating the environment, use the following command to install pyspark,
+a Python version of your choice, and any other packages you want to use in the
+same session as pyspark (you can also install them in several steps).
 
 .. code-block:: bash
 
-    pip install pyspark
+    conda install -c conda-forge pyspark python [other packages]  # can also use python=3.8, etc.
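+
+As a quick check that the installation works (a minimal sketch; the version string printed will
+depend on what conda resolved), you can import the package and print its version:
+
+.. code-block:: bash
+
+    # print the installed PySpark version from the active environment
+    python -c "import pyspark; print(pyspark.__version__)"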

Review comment:
       Could we just link to [Using Pip in a Conda Environment](https://www.anaconda.com/blog/using-pip-in-a-conda-environment)?
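
       For reference, the pattern that article recommends (conda packages first, pip last) would
       look roughly like this (a sketch; the Python version pin is just an example):

       ```bash
       # create the environment and install conda packages first
       conda create -n pyspark_env python=3.8
       conda activate pyspark_env
       # install pip-only packages as the final step
       pip install pyspark
       ```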




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


