Posted to commits@spark.apache.org by do...@apache.org on 2019/07/22 17:45:52 UTC

[spark] branch branch-2.4 updated: [SPARK-28468][INFRA][2.4] Upgrade pip to fix `sphinx` install error

This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-2.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-2.4 by this push:
     new c01c294  [SPARK-28468][INFRA][2.4] Upgrade pip to fix `sphinx` install error
c01c294 is described below

commit c01c294effd6fe98d528e2c317b37249ae1c3572
Author: Dongjoon Hyun <dh...@apple.com>
AuthorDate: Mon Jul 22 10:45:18 2019 -0700

    [SPARK-28468][INFRA][2.4] Upgrade pip to fix `sphinx` install error
    
    ## What changes were proposed in this pull request?
    
    Spark 2.4.x should be an LTS version, and we should use the release script in `branch-2.4` to avoid the previous mistakes. Currently, `do-release-docker.sh` fails while installing `sphinx` into `Python 2.7` on `branch-2.4` only. This PR upgrades `pip` to handle this; the sketch after the logs below illustrates why the upgrade helps.
    ```
    $ dev/create-release/do-release-docker.sh -d /tmp/spark-2.4.4 -n
    ...
    = Building spark-rm image with tag latest...
    Command: docker build -t spark-rm:latest --build-arg UID=501 /Users/dhyun/APACHE/spark-2.4/dev/create-release/spark-rm
    Log file: docker-build.log
    // Terminated.
    ```
    ```
    $ tail /tmp/spark-2.4.4/docker-build.log
    Collecting sphinx
      Downloading https://files.pythonhosted.org/packages/89/1e/64c77163706556b647f99d67b42fced9d39ae6b1b86673965a2cd28037b5/Sphinx-2.1.2.tar.gz (6.3MB)
        Complete output from command python setup.py egg_info:
        ERROR: Sphinx requires at least Python 3.5 to run.
    
        ----------------------------------------
    Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-build-2tylGA/sphinx/
    You are using pip version 8.1.1, however version 19.1.1 is available.
    You should consider upgrading via the 'pip install --upgrade pip' command.
    ```
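
    The failure happens because the bundled `pip` 8.1.1 predates support for the `Requires-Python` metadata, so it downloads the latest Sphinx (2.x), which only runs on Python 3.5+. A newer `pip` skips those releases and resolves to the last Python-2.7-compatible Sphinx (the 1.8.x line). As a rough illustration, not part of this patch, the same behavior can be checked in a plain Python 2.7 container:
    ```
    # Rough illustration (not from this patch): with an up-to-date pip, the
    # resolver honors Requires-Python and installs a Python-2.7-compatible
    # Sphinx (1.8.x) instead of failing on Sphinx 2.x.
    $ docker run --rm python:2.7-slim /bin/bash -c "
        pip install --upgrade pip && hash -r &&
        pip install sphinx &&
        python -c 'import sphinx; print(sphinx.__version__)'"
    ```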
    
    The following is a short reproduction step.
    ```
    $ docker build -t spark-rm-test2 --build-arg UID=501 dev/create-release/spark-rm
    ```
    
    ## How was this patch tested?
    
    Manual.
    ```
    $ docker build -t spark-rm-test2 --build-arg UID=501 dev/create-release/spark-rm
    ```
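
    Beyond the rebuild, one possible spot check (not from the PR) is to confirm that the image ends up with an upgraded `pip` and a Python-2.7-compatible `sphinx`:
    ```
    # Hypothetical spot check on the image built above; --entrypoint avoids
    # relying on whatever entrypoint the spark-rm image defines.
    $ docker run --rm --entrypoint /bin/bash spark-rm-test2 -c \
        "pip --version && python -c 'import sphinx; print(sphinx.__version__)'"
    ```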
    
    Closes #25226 from dongjoon-hyun/SPARK-28468.
    
    Authored-by: Dongjoon Hyun <dh...@apple.com>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 dev/create-release/spark-rm/Dockerfile | 1 +
 1 file changed, 1 insertion(+)

diff --git a/dev/create-release/spark-rm/Dockerfile b/dev/create-release/spark-rm/Dockerfile
index bee6284..f78d01f 100644
--- a/dev/create-release/spark-rm/Dockerfile
+++ b/dev/create-release/spark-rm/Dockerfile
@@ -60,6 +60,7 @@ RUN apt-get clean && apt-get update && $APT_INSTALL gnupg ca-certificates apt-tr
   $APT_INSTALL nodejs && \
   # Install needed python packages. Use pip for installing packages (for consistency).
   $APT_INSTALL libpython2.7-dev libpython3-dev python-pip python3-pip && \
+  pip install --upgrade pip && hash -r pip && \
   pip install $BASE_PIP_PKGS && \
   pip install $PIP_PKGS && \
   cd && \


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org