Posted to commits@spark.apache.org by do...@apache.org on 2021/11/29 05:45:26 UTC

[spark] branch master updated: [SPARK-37319][K8S][FOLLOWUP] Set JAVA_HOME for Java 17 installed by apt-get

This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new a3886ba  [SPARK-37319][K8S][FOLLOWUP] Set JAVA_HOME for Java 17 installed by apt-get
a3886ba is described below

commit a3886ba976469bef0dfafc3da8686a53c5a59d95
Author: Kousuke Saruta <sa...@oss.nttdata.com>
AuthorDate: Sun Nov 28 21:44:42 2021 -0800

    [SPARK-37319][K8S][FOLLOWUP] Set JAVA_HOME for Java 17 installed by apt-get
    
    ### What changes were proposed in this pull request?
    
    This PR adds a configuration to `Dockerfile.java17` to set the environment variable `JAVA_HOME` for Java 17 installed by apt-get.
    
    ### Why are the changes needed?
    
    In `entrypoint.sh`, `${JAVA_HOME}/bin/java` is used, but the container image built from `Dockerfile.java17` does not set that environment variable.
    As a result, executors can't launch.
    ```
    + CMD=(${JAVA_HOME}/bin/java "${SPARK_EXECUTOR_JAVA_OPTS[]}" -Xms$SPARK_EXECUTOR_MEMORY -Xmx$SPARK_EXECUTOR_MEMORY -cp "$SPARK_CLASSPATH:$SPARK_DIST_CLASSPATH" org.apache.spark.scheduler.cluster.k8s.KubernetesExecutorBackend --driver-url $SPARK_DRIVER_URL --executor-id $SPARK_EXECUTOR_ID --cores $SPARK_EXECUTOR_CORES --app-id $SPARK_APPLICATION_ID --hostname $SPARK_EXECUTOR_POD_IP --resourceProfileId $SPARK_RESOURCE_PROFILE_ID --podName $SPARK_EXECUTOR_POD_NAME)
    + exec /usr/bin/tini -s -- /bin/java -XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.at [...]
    [FATAL tini (15)] exec /bin/java failed: No such file or directory
    ```
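    The failure mode can be reproduced outside the container. The sketch below is illustrative and not part of the patch; it shows how the expansion used in `entrypoint.sh` degrades to `/bin/java` when `JAVA_HOME` is unset:

    ```shell
    # When JAVA_HOME is unset, "${JAVA_HOME}/bin/java" expands to "/bin/java",
    # a path that does not exist in the Java 17 image, so tini's exec fails.
    unset JAVA_HOME
    CMD=("${JAVA_HOME}/bin/java" -version)
    echo "resolved java binary: ${CMD[0]}"   # prints "resolved java binary: /bin/java"
    ```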
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Confirmed that the following simple job can run successfully with a container image built from the modified `Dockerfile.java17`.
    ```
    $ bin/spark-shell --master k8s://https://<host>:<port> --conf spark.kubernetes.container.image=spark:<tag>
    scala> spark.range(10).show
    +---+
    | id|
    +---+
    |  0|
    |  1|
    |  2|
    |  3|
    |  4|
    |  5|
    |  6|
    |  7|
    |  8|
    |  9|
    +---+
    ```
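    For reference, an image from the modified Dockerfile can typically be built with Spark's `docker-image-tool.sh`. This is a command template, not output from the patch: `<tag>` is a placeholder, and the command assumes an unpacked Spark distribution and a local Docker daemon.

    ```shell
    # Build the JVM image from the Java 17 Dockerfile (tag is illustrative).
    ./bin/docker-image-tool.sh -t <tag> \
      -f ./resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile.java17 \
      build
    ```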
    
    Closes #34722 from sarutak/java17-home-kube.
    
    Authored-by: Kousuke Saruta <sa...@oss.nttdata.com>
    Signed-off-by: Dongjoon Hyun <do...@apache.org>
---
 .../kubernetes/docker/src/main/dockerfiles/spark/Dockerfile.java17       | 1 +
 1 file changed, 1 insertion(+)

diff --git a/resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile.java17 b/resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile.java17
index f9ab64e..96dd6c9 100644
--- a/resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile.java17
+++ b/resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile.java17
@@ -51,6 +51,7 @@ COPY kubernetes/tests /opt/spark/tests
 COPY data /opt/spark/data
 
 ENV SPARK_HOME /opt/spark
+ENV JAVA_HOME /usr/lib/jvm/java-17-openjdk-amd64/
 
 WORKDIR /opt/spark/work-dir
 RUN chmod g+w /opt/spark/work-dir
