Posted to commits@spark.apache.org by do...@apache.org on 2020/07/17 19:09:38 UTC
[spark] branch branch-3.0 updated: [SPARK-32353][TEST] Update docker/spark-test and clean up unused stuff
This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.0 by this push:
new 76496e7 [SPARK-32353][TEST] Update docker/spark-test and clean up unused stuff
76496e7 is described below
commit 76496e70f155127519f28e2257de457bd4ee91b7
Author: William Hyun <wi...@gmail.com>
AuthorDate: Fri Jul 17 12:05:45 2020 -0700
[SPARK-32353][TEST] Update docker/spark-test and clean up unused stuff
### What changes were proposed in this pull request?
This PR aims to update the docker/spark-test and clean up unused stuff.
### Why are the changes needed?
Since Spark 3.0.0, Java 11 is supported. We should use a more recent Java and OS in these test images.
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Manually do the following as described in https://github.com/apache/spark/blob/master/external/docker/spark-test/README.md.
```
docker run -v $SPARK_HOME:/opt/spark spark-test-master
docker run -v $SPARK_HOME:/opt/spark spark-test-worker spark://<master_ip>:7077
```
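Before the run commands above can work, the images have to be built. A minimal sketch of that step, assuming the image tags `spark-test-base`, `spark-test-master`, and `spark-test-worker` implied by the `docker run` commands above (the exact build procedure is described in the linked README):
```
# Build the base image first, then the master and worker images on top of it.
# Run from the root of a Spark checkout. The tags here are assumptions matching
# the run commands above, not necessarily the tags the README uses.
docker build -t spark-test-base external/docker/spark-test/base
docker build -t spark-test-master external/docker/spark-test/master
docker build -t spark-test-worker external/docker/spark-test/worker

# Optional sanity check: confirm the base image now ships OpenJDK 11.
docker run --rm spark-test-base java -version
```
These commands require a running Docker daemon; the run commands above additionally expect $SPARK_HOME to point at a built Spark distribution on the host.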
Closes #29150 from williamhyun/docker.
Authored-by: William Hyun <wi...@gmail.com>
Signed-off-by: Dongjoon Hyun <do...@apache.org>
(cherry picked from commit 7dc1d8917dd01b5d5808460a5eb6e846795ab4bd)
Signed-off-by: Dongjoon Hyun <do...@apache.org>
---
external/docker/spark-test/base/Dockerfile | 15 +++------------
external/docker/spark-test/master/default_cmd | 3 ---
external/docker/spark-test/worker/default_cmd | 3 ---
3 files changed, 3 insertions(+), 18 deletions(-)
diff --git a/external/docker/spark-test/base/Dockerfile b/external/docker/spark-test/base/Dockerfile
index 5bec5d3..d4a30c4 100644
--- a/external/docker/spark-test/base/Dockerfile
+++ b/external/docker/spark-test/base/Dockerfile
@@ -15,23 +15,14 @@
# limitations under the License.
#
-FROM ubuntu:xenial
+FROM ubuntu:20.04
# Upgrade package index
-# install a few other useful packages plus Open Jdk 8
+# install a few other useful packages plus Open Java 11
# Remove unneeded /var/lib/apt/lists/* after install to reduce the
# docker image size (by ~30MB)
RUN apt-get update && \
- apt-get install -y less openjdk-8-jre-headless iproute2 vim-tiny sudo openssh-server && \
+ apt-get install -y less openjdk-11-jre-headless iproute2 vim-tiny sudo openssh-server && \
rm -rf /var/lib/apt/lists/*
-ENV SCALA_VERSION 2.12.10
-ENV CDH_VERSION cdh4
-ENV SCALA_HOME /opt/scala-$SCALA_VERSION
ENV SPARK_HOME /opt/spark
-ENV PATH $SPARK_HOME:$SCALA_HOME/bin:$PATH
-
-# Install Scala
-ADD https://www.scala-lang.org/files/archive/scala-$SCALA_VERSION.tgz /
-RUN (cd / && gunzip < scala-$SCALA_VERSION.tgz)|(cd /opt && tar -xvf -)
-RUN rm /scala-$SCALA_VERSION.tgz
diff --git a/external/docker/spark-test/master/default_cmd b/external/docker/spark-test/master/default_cmd
index 5a7da34..96a36cd 100755
--- a/external/docker/spark-test/master/default_cmd
+++ b/external/docker/spark-test/master/default_cmd
@@ -22,7 +22,4 @@ echo "CONTAINER_IP=$IP"
export SPARK_LOCAL_IP=$IP
export SPARK_PUBLIC_DNS=$IP
-# Avoid the default Docker behavior of mapping our IP address to an unreachable host name
-umount /etc/hosts
-
/opt/spark/bin/spark-class org.apache.spark.deploy.master.Master -i $IP
diff --git a/external/docker/spark-test/worker/default_cmd b/external/docker/spark-test/worker/default_cmd
index 31b06cb..2401f55 100755
--- a/external/docker/spark-test/worker/default_cmd
+++ b/external/docker/spark-test/worker/default_cmd
@@ -22,7 +22,4 @@ echo "CONTAINER_IP=$IP"
export SPARK_LOCAL_IP=$IP
export SPARK_PUBLIC_DNS=$IP
-# Avoid the default Docker behavior of mapping our IP address to an unreachable host name
-umount /etc/hosts
-
/opt/spark/bin/spark-class org.apache.spark.deploy.worker.Worker $1
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org