Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2020/06/15 18:27:00 UTC

[jira] [Resolved] (SPARK-31994) Docker image should use `https` urls for only mirrors that support it(SSL)

     [ https://issues.apache.org/jira/browse/SPARK-31994?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-31994.
-----------------------------------
    Fix Version/s: 3.1.0
                   3.0.1
       Resolution: Fixed

Issue resolved by pull request 28834
[https://github.com/apache/spark/pull/28834]

> Docker image should use `https` urls for only mirrors that support it(SSL)
> --------------------------------------------------------------------------
>
>                 Key: SPARK-31994
>                 URL: https://issues.apache.org/jira/browse/SPARK-31994
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 3.0.0, 3.1.0
>            Reporter: Prashant Sharma
>            Assignee: Prashant Sharma
>            Priority: Major
>             Fix For: 3.0.1, 3.1.0
>
>
> It appears that security.debian.org does not support https.
> {code}
> curl https://security.debian.org
> curl: (35) LibreSSL SSL_connect: SSL_ERROR_SYSCALL in connection to security.debian.org:443 
> {code}
> While building the image, it fails in the following way.
> {code}
> MacBook-Pro:spark prashantsharma$ bin/docker-image-tool.sh -r scrapcodes -t v3.1.0-1 build
> Sending build context to Docker daemon  222.1MB
> Step 1/18 : ARG java_image_tag=8-jre-slim
> Step 2/18 : FROM openjdk:${java_image_tag}
>  ---> 381b20190cf7
> Step 3/18 : ARG spark_uid=185
>  ---> Using cache
>  ---> 65c06f86753c
> Step 4/18 : RUN set -ex &&     sed -i 's/http:/https:/g' /etc/apt/sources.list &&     apt-get update &&     ln -s /lib /lib64 &&     apt install -y bash tini libc6 libpam-modules krb5-user libnss3 procps &&     mkdir -p /opt/spark &&     mkdir -p /opt/spark/examples &&     mkdir -p /opt/spark/work-dir &&     touch /opt/spark/RELEASE &&     rm /bin/sh &&     ln -sv /bin/bash /bin/sh &&     echo "auth required pam_wheel.so use_uid" >> /etc/pam.d/su &&     chgrp root /etc/passwd && chmod ug+rw /etc/passwd &&     rm -rf /var/cache/apt/*
>  ---> Running in a3461dadd6eb
> + sed -i s/http:/https:/g /etc/apt/sources.list
> + apt-get update
> Ign:1 https://security.debian.org/debian-security buster/updates InRelease
> Err:2 https://security.debian.org/debian-security buster/updates Release
>   Could not handshake: The TLS connection was non-properly terminated. [IP: 151.101.0.204 443]
> Get:3 https://deb.debian.org/debian buster InRelease [121 kB]
> Get:4 https://deb.debian.org/debian buster-updates InRelease [51.9 kB]
> Get:5 https://deb.debian.org/debian buster/main amd64 Packages [7905 kB]
> Get:6 https://deb.debian.org/debian buster-updates/main amd64 Packages [7868 B]
> Reading package lists...
> E: The repository 'https://security.debian.org/debian-security buster/updates Release' does not have a Release file.
> The command '/bin/sh -c set -ex &&     sed -i 's/http:/https:/g' /etc/apt/sources.list &&     apt-get update &&     ln -s /lib /lib64 &&     apt install -y bash tini libc6 libpam-modules krb5-user libnss3 procps &&     mkdir -p /opt/spark &&     mkdir -p /opt/spark/examples &&     mkdir -p /opt/spark/work-dir &&     touch /opt/spark/RELEASE &&     rm /bin/sh &&     ln -sv /bin/bash /bin/sh &&     echo "auth required pam_wheel.so use_uid" >> /etc/pam.d/su &&     chgrp root /etc/passwd && chmod ug+rw /etc/passwd &&     rm -rf /var/cache/apt/*' returned a non-zero code: 100
> Failed to build Spark JVM Docker image, please refer to Docker build output for details.
> {code}
> So, if we limit the https rewrite to deb.debian.org only, the mirror that actually supports it, that does the trick.
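> A minimal sketch of that idea (illustrative only, not necessarily the exact change in PR 28834): instead of rewriting every {{http:}} entry in {{/etc/apt/sources.list}} to {{https:}}, match only the {{deb.debian.org}} host, so {{security.debian.org}} stays on plain http.
> {code}
> # Sample sources.list as found in the openjdk:8-jre-slim base image (illustrative).
> cat > /tmp/sources.list <<'EOF'
> deb http://deb.debian.org/debian buster main
> deb http://security.debian.org/debian-security buster/updates main
> EOF
>
> # Rewrite only deb.debian.org to https; security.debian.org is left untouched
> # because it does not terminate TLS properly.
> sed -i 's|http://deb\.debian\.org|https://deb.debian.org|g' /tmp/sources.list
>
> cat /tmp/sources.list
> # deb https://deb.debian.org/debian buster main
> # deb http://security.debian.org/debian-security buster/updates main
> {code}
> With this narrower pattern, {{apt-get update}} no longer tries an https handshake against security.debian.org, so the image build proceeds past Step 4.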



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org