Posted to commits@spark.apache.org by do...@apache.org on 2022/01/28 09:07:38 UTC

[spark] branch master updated: [SPARK-38049][K8S][TESTS] Use Java 17 in K8s integration tests by default

This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 169be3e  [SPARK-38049][K8S][TESTS] Use Java 17 in K8s integration tests by default
169be3e is described below

commit 169be3e1e6c8743390d3ca401f762b69b328ccfd
Author: Dongjoon Hyun <do...@apache.org>
AuthorDate: Fri Jan 28 01:06:34 2022 -0800

    [SPARK-38049][K8S][TESTS] Use Java 17 in K8s integration tests by default
    
    ### What changes were proposed in this pull request?
    
    This PR aims to use `Java 17` in K8s integration tests by default.
    
    ### Why are the changes needed?
    
    A Java 8 runtime cannot run a Spark distribution built with Java 11 or 17.
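
    As a sketch of the resulting behavior (property names are taken from the diff
    below; the tag value is illustrative), a base-image build can still be
    requested by setting `spark.kubernetes.test.javaImageTag` explicitly, which
    bypasses the new `Dockerfile.java17` default:

    ```
    $ build/sbt -Pkubernetes -Pkubernetes-integration-tests \
        -Dspark.kubernetes.test.javaImageTag=11-jre-slim \
        "kubernetes-integration-tests/test"
    ```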
    
    ### Does this PR introduce _any_ user-facing change?

    No.
    
    ### How was this patch tested?
    
    Manually ran the following and checked the Java version in the generated Docker images.
    
    **SBT**
    ```
    $ build/sbt -Psparkr -Pkubernetes -Pkubernetes-integration-tests -Dtest.exclude.tags=minikube -Dspark.kubernetes.test.deployMode=docker-for-desktop "kubernetes-integration-tests/test"
    ...
    [info] KubernetesSuite:
    [info] - Run SparkPi with no resources (8 seconds, 949 milliseconds)
    [info] - Run SparkPi with no resources & statefulset allocation (8 seconds, 515 milliseconds)
    [info] - Run SparkPi with a very long application name. (8 seconds, 389 milliseconds)
    [info] - Use SparkLauncher.NO_RESOURCE (8 seconds, 393 milliseconds)
    [info] - Run SparkPi with a master URL without a scheme. (8 seconds, 360 milliseconds)
    [info] - Run SparkPi with an argument. (8 seconds, 435 milliseconds)
    [info] - Run SparkPi with custom labels, annotations, and environment variables. (8 seconds, 611 milliseconds)
    [info] - All pods have the same service account by default (8 seconds, 353 milliseconds)
    [info] - Run extraJVMOptions check on driver (4 seconds, 364 milliseconds)
    [info] - Run SparkRemoteFileTest using a remote data file (8 seconds, 392 milliseconds)
    [info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (14 seconds, 564 milliseconds)
    [info] - Run SparkPi with env and mount secrets. (16 seconds, 868 milliseconds)
    [info] - Run PySpark on simple pi.py example (9 seconds, 632 milliseconds)
    [info] - Run PySpark to test a pyfiles example (10 seconds, 520 milliseconds)
    [info] - Run PySpark with memory customization (8 seconds, 385 milliseconds)
    [info] - Run in client mode. (7 seconds, 336 milliseconds)
    [info] - Start pod creation from template (8 seconds, 727 milliseconds)
    [info] - Test basic decommissioning (42 seconds, 353 milliseconds)
    [info] - Test basic decommissioning with shuffle cleanup (42 seconds, 532 milliseconds)
    [info] - Test decommissioning with dynamic allocation & shuffle cleanups (2 minutes, 40 seconds)
    [info] - Test decommissioning timeouts (42 seconds, 211 milliseconds)
    [info] - SPARK-37576: Rolling decommissioning (1 minute, 7 seconds)
    [info] - Run SparkR on simple dataframe.R example (12 seconds, 16 milliseconds)
    [info] Run completed in 11 minutes, 24 seconds.
    [info] Total number of tests run: 23
    [info] Suites: completed 1, aborted 0
    [info] Tests: succeeded 23, failed 0, canceled 0, ignored 0, pending 0
    [info] All tests passed.
    ```
    
    **MAVEN**
    ```
    $ mvn package -Pkubernetes -DskipTests
    $ resource-managers/kubernetes/integration-tests/dev/dev-run-integration-tests.sh --deploy-mode docker-for-desktop --namespace default --exclude-tags minikube,r
    ...
    KubernetesSuite:
    - Run SparkPi with no resources
    - Run SparkPi with no resources & statefulset allocation
    - Run SparkPi with a very long application name.
    - Use SparkLauncher.NO_RESOURCE
    - Run SparkPi with a master URL without a scheme.
    - Run SparkPi with an argument.
    - Run SparkPi with custom labels, annotations, and environment variables.
    - All pods have the same service account by default
    - Run extraJVMOptions check on driver
    - Run SparkRemoteFileTest using a remote data file
    - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties
    - Run SparkPi with env and mount secrets.
    - Run PySpark on simple pi.py example
    - Run PySpark to test a pyfiles example
    - Run PySpark with memory customization
    - Run in client mode.
    - Start pod creation from template
    - Test basic decommissioning
    - Test basic decommissioning with shuffle cleanup
    - Test decommissioning with dynamic allocation & shuffle cleanups
    - Test decommissioning timeouts
    - SPARK-37576: Rolling decommissioning
    Run completed in 8 minutes, 52 seconds.
    Total number of tests run: 22
    Suites: completed 2, aborted 0
    Tests: succeeded 22, failed 0, canceled 0, ignored 0, pending 0
    All tests passed.
    ```
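
    A hedged variant for the Maven flow (assuming the dev script forwards a
    `--java-image-tag` option to `setup-integration-test-env.sh`, whose
    `JAVA_IMAGE_TAG` default this commit changes to `N/A`):

    ```
    $ resource-managers/kubernetes/integration-tests/dev/dev-run-integration-tests.sh \
        --deploy-mode docker-for-desktop --java-image-tag 11-jre-slim
    ```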
    
    ```
    $ docker run -it --rm kubespark/spark:3.3.0-SNAPSHOT_C0A4AD5A-2561-4972-B2DE-0FDA941F8064 java -version | tail -n1
    OpenJDK 64-Bit Server VM (build 17.0.2+8-Debian-1deb11u1, mixed mode, sharing)
    ```
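
    For reference, a minimal sketch of building the same Java 17 image by hand,
    using the flags that `SparkBuild.scala` now passes (`-f` selects the
    Dockerfile; the image tag here is illustrative):

    ```
    $ bin/docker-image-tool.sh -t 3.3.0-SNAPSHOT \
        -f resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile.java17 \
        build
    ```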
    
    Closes #35346 from dongjoon-hyun/SPARK-38049.
    
    Authored-by: Dongjoon Hyun <do...@apache.org>
    Signed-off-by: Dongjoon Hyun <do...@apache.org>
---
 project/SparkBuild.scala                                      | 11 ++++++-----
 resource-managers/kubernetes/integration-tests/pom.xml        |  4 ++--
 .../integration-tests/scripts/setup-integration-test-env.sh   |  2 +-
 3 files changed, 9 insertions(+), 8 deletions(-)

diff --git a/project/SparkBuild.scala b/project/SparkBuild.scala
index ad9aef5..8a56bef 100644
--- a/project/SparkBuild.scala
+++ b/project/SparkBuild.scala
@@ -639,12 +639,13 @@ object KubernetesIntegrationTests {
       if (shouldBuildImage) {
         val dockerTool = s"$sparkHome/bin/docker-image-tool.sh"
         val bindingsDir = s"$sparkHome/resource-managers/kubernetes/docker/src/main/dockerfiles/spark/bindings"
-        val dockerFile = sys.props.get("spark.kubernetes.test.dockerFile")
-        val javaImageTag = sys.props.getOrElse("spark.kubernetes.test.javaImageTag", "8-jre-slim")
-        val extraOptions = if (dockerFile.isDefined) {
-          Seq("-f", s"${dockerFile.get}")
-        } else {
+        val javaImageTag = sys.props.get("spark.kubernetes.test.javaImageTag")
+        val dockerFile = sys.props.getOrElse("spark.kubernetes.test.dockerFile",
+            "resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile.java17")
+        val extraOptions = if (javaImageTag.isDefined) {
           Seq("-b", s"java_image_tag=$javaImageTag")
+        } else {
+          Seq("-f", s"$dockerFile")
         }
         val cmd = Seq(dockerTool,
           "-t", imageTag.value,
diff --git a/resource-managers/kubernetes/integration-tests/pom.xml b/resource-managers/kubernetes/integration-tests/pom.xml
index 4c5f14b..a44cedb 100644
--- a/resource-managers/kubernetes/integration-tests/pom.xml
+++ b/resource-managers/kubernetes/integration-tests/pom.xml
@@ -35,7 +35,7 @@
     <spark.kubernetes.test.sparkTgz>N/A</spark.kubernetes.test.sparkTgz>
     <spark.kubernetes.test.unpackSparkDir>${project.build.directory}/spark-dist-unpacked</spark.kubernetes.test.unpackSparkDir>
     <spark.kubernetes.test.imageTag>N/A</spark.kubernetes.test.imageTag>
-    <spark.kubernetes.test.javaImageTag>8-jre-slim</spark.kubernetes.test.javaImageTag>
+    <spark.kubernetes.test.javaImageTag>N/A</spark.kubernetes.test.javaImageTag>
     <spark.kubernetes.test.imageTagFile>${project.build.directory}/imageTag.txt</spark.kubernetes.test.imageTagFile>
     <spark.kubernetes.test.deployMode>minikube</spark.kubernetes.test.deployMode>
     <spark.kubernetes.test.imageRepo>docker.io/kubespark</spark.kubernetes.test.imageRepo>
@@ -43,7 +43,7 @@
     <spark.kubernetes.test.master></spark.kubernetes.test.master>
     <spark.kubernetes.test.namespace></spark.kubernetes.test.namespace>
     <spark.kubernetes.test.serviceAccountName></spark.kubernetes.test.serviceAccountName>
-    <spark.kubernetes.test.dockerFile>N/A</spark.kubernetes.test.dockerFile>
+    <spark.kubernetes.test.dockerFile>resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile.java17</spark.kubernetes.test.dockerFile>
 
     <test.exclude.tags></test.exclude.tags>
     <test.include.tags></test.include.tags>
diff --git a/resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh b/resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh
index 562d1d8..f79b1f8 100755
--- a/resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh
+++ b/resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh
@@ -23,7 +23,7 @@ IMAGE_TAG_OUTPUT_FILE="$TEST_ROOT_DIR/target/image-tag.txt"
 DEPLOY_MODE="minikube"
 IMAGE_REPO="docker.io/kubespark"
 IMAGE_TAG="N/A"
-JAVA_IMAGE_TAG="8-jre-slim"
+JAVA_IMAGE_TAG="N/A"
 SPARK_TGZ="N/A"
 MVN="$TEST_ROOT_DIR/build/mvn"
 DOCKER_FILE="N/A"

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org