Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/10/31 04:07:51 UTC

[GitHub] [spark-docker] dcoliversun commented on pull request #21: [SPARK-40569] Add smoke test in standalone cluster for spark-docker

dcoliversun commented on PR #21:
URL: https://github.com/apache/spark-docker/pull/21#issuecomment-1296512705

   ```
   #13 [linux/amd64 4/9] RUN set -ex;     export SPARK_TMP="$(mktemp -d)";     cd $SPARK_TMP;     wget -nv -O spark.tgz "https://dlcdn.apache.org/spark/spark-3.3.0/spark-3.3.0-bin-hadoop3.tgz";     wget -nv -O spark.tgz.asc "https://downloads.apache.org/spark/spark-3.3.0/spark-3.3.0-bin-hadoop3.tgz.asc";     export GNUPGHOME="$(mktemp -d)";     gpg --keyserver hkps://keys.openpgp.org --recv-key "80FB8EBE8EBA68504989703491B5DC815DBF10D3" ||     gpg --keyserver hkps://keyserver.ubuntu.com --recv-keys "80FB8EBE8EBA68504989703491B5DC815DBF10D3";     gpg --batch --verify spark.tgz.asc spark.tgz;     gpgconf --kill all;     rm -rf "$GNUPGHOME" spark.tgz.asc;         tar -xf spark.tgz --strip-components=1;     chown -R spark:spark .;     mv jars /opt/spark/;     mv bin /opt/spark/;     mv sbin /opt/spark/;     mv kubernetes/dockerfiles/spark/decom.sh /opt/;     mv examples /opt/spark/;     mv kubernetes/tests /opt/spark/;     mv data /opt/spark/;     mv python/pyspark /opt/spark/python/pyspark/;     mv python/lib /opt/spark/python/lib/;     cd ..;     rm -rf "$SPARK_TMP";
   #0 0.132 ++ mktemp -d
   #0 0.133 + export SPARK_TMP=/tmp/tmp.oEdW8CyP9h
   #0 0.133 + SPARK_TMP=/tmp/tmp.oEdW8CyP9h
   #0 0.133 + cd /tmp/tmp.oEdW8CyP9h
   #0 0.133 + wget -nv -O spark.tgz https://dlcdn.apache.org/spark/spark-3.3.0/spark-3.3.0-bin-hadoop3.tgz
   #0 0.152 https://dlcdn.apache.org/spark/spark-3.3.0/spark-3.3.0-bin-hadoop3.tgz:
   #0 0.152 2022-10-31 04:06:44 ERROR 404: Not Found.
   #13 ERROR: process "/bin/sh -c set -ex;     export SPARK_TMP=\"$(mktemp -d)\";     cd $SPARK_TMP;     wget -nv -O spark.tgz \"$SPARK_TGZ_URL\";     wget -nv -O spark.tgz.asc \"$SPARK_TGZ_ASC_URL\";     export GNUPGHOME=\"$(mktemp -d)\";     gpg --keyserver hkps://keys.openpgp.org --recv-key \"$GPG_KEY\" ||     gpg --keyserver hkps://keyserver.ubuntu.com --recv-keys \"$GPG_KEY\";     gpg --batch --verify spark.tgz.asc spark.tgz;     gpgconf --kill all;     rm -rf \"$GNUPGHOME\" spark.tgz.asc;         tar -xf spark.tgz --strip-components=1;     chown -R spark:spark .;     mv jars /opt/spark/;     mv bin /opt/spark/;     mv sbin /opt/spark/;     mv kubernetes/dockerfiles/spark/decom.sh /opt/;     mv examples /opt/spark/;     mv kubernetes/tests /opt/spark/;     mv data /opt/spark/;     mv python/pyspark /opt/spark/python/pyspark/;     mv python/lib /opt/spark/python/lib/;     cd ..;     rm -rf \"$SPARK_TMP\";" did not complete successfully: exit code: 8
   ```
   The same issue also shows up in the GA (GitHub Actions) run. @Yikun
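   The 404 most likely means the 3.3.0 tarball was pruned from dlcdn.apache.org after a newer release was published, since that mirror only keeps the latest versions. A minimal sketch of one possible workaround, assuming the release is still available on archive.apache.org (the fallback URL and the retry logic here are an assumption, not what the Dockerfile currently does):
   ```
   # Primary mirror only hosts current releases; older tarballs return 404.
   SPARK_TGZ_URL="https://dlcdn.apache.org/spark/spark-3.3.0/spark-3.3.0-bin-hadoop3.tgz"
   # Assumed fallback: the Apache archive keeps all past releases.
   SPARK_ARCHIVE_URL="https://archive.apache.org/dist/spark/spark-3.3.0/spark-3.3.0-bin-hadoop3.tgz"

   # Try the primary mirror first, then fall back to the archive on failure.
   wget -nv -O spark.tgz "$SPARK_TGZ_URL" || wget -nv -O spark.tgz "$SPARK_ARCHIVE_URL"
   ```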


