Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/03/02 23:02:18 UTC

[GitHub] [spark] dongjoon-hyun commented on pull request #35376: [SPARK-38081][K8S][TESTS] Support `cloud`-backend in K8s IT with SBT

dongjoon-hyun commented on pull request #35376:
URL: https://github.com/apache/spark/pull/35376#issuecomment-1057478328


   I backported this test-only patch to branch-3.2 for Apache Spark 3.2.2. It was verified directly on an `EKS Graviton (ARM64)` cluster, running the latest `branch-3.2` plus this patch.
   ```
   $ build/sbt -Psparkr -Pkubernetes -Pkubernetes-integration-tests -Dtest.exclude.tags=minikube,local -Dspark.kubernetes.test.deployMode=cloud -Dspark.kubernetes.test.master=k8s://https://....eks.amazonaws.com -Dspark.kubernetes.test.imageRepo=... -Dspark.kubernetes.test.imageTag=20220302 "kubernetes-integration-tests/test"
   ...
   [info] KubernetesSuite:
   [info] - Run SparkPi with no resources (55 seconds, 72 milliseconds)
   [info] - Run SparkPi with a very long application name. (14 seconds, 66 milliseconds)
   [info] - Use SparkLauncher.NO_RESOURCE (13 seconds, 491 milliseconds)
   [info] - Run SparkPi with a master URL without a scheme. (13 seconds, 515 milliseconds)
   [info] - Run SparkPi with an argument. (13 seconds, 599 milliseconds)
   [info] - Run SparkPi with custom labels, annotations, and environment variables. (13 seconds, 895 milliseconds)
   [info] - All pods have the same service account by default (14 seconds, 12 milliseconds)
   [info] - Run extraJVMOptions check on driver (7 seconds, 538 milliseconds)
   [info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j.properties (15 seconds, 350 milliseconds)
   [info] - Run SparkPi with env and mount secrets. (23 seconds, 595 milliseconds)
   [info] - Run PySpark on simple pi.py example (33 seconds, 942 milliseconds)
   [info] - Run PySpark to test a pyfiles example (18 seconds, 532 milliseconds)
   [info] - Run PySpark with memory customization (15 seconds, 747 milliseconds)
   [info] - Run in client mode. (10 seconds, 351 milliseconds)
   [info] - Start pod creation from template (13 seconds, 869 milliseconds)
   [info] - Test basic decommissioning (47 seconds, 300 milliseconds)
   [info] - Test basic decommissioning with shuffle cleanup (48 seconds, 170 milliseconds)
   [info] - Test decommissioning with dynamic allocation & shuffle cleanups (2 minutes, 49 seconds)
   [info] - Test decommissioning timeouts (47 seconds, 304 milliseconds)
   [info] - Run SparkR on simple dataframe.R example (54 seconds, 275 milliseconds)
   [info] Run completed in 13 minutes, 29 seconds.
   [info] Total number of tests run: 20
   [info] Suites: completed 1, aborted 0
   [info] Tests: succeeded 20, failed 0, canceled 0, ignored 0, pending 0
   [info] All tests passed.
   [success] Total time: 1059 s (17:39), completed Mar 2, 2022, 3:01:10 PM
   ```
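
   For reference, the invocation above can be unpacked flag by flag. This is a sketch of what each option selects, based on the properties used by Spark's K8s integration tests; the elided master URL and image repository are placeholders, exactly as in the original log:
   ```
   # -P…                — enable the SparkR, Kubernetes, and K8s integration-test build profiles
   # test.exclude.tags  — skip tests tagged for the minikube and local backends
   # deployMode=cloud   — run against a real remote cluster instead of minikube
   # test.master        — the target K8s API server (an elided EKS endpoint here)
   # imageRepo/imageTag — repository and tag of the pre-built Spark test images
   build/sbt -Psparkr -Pkubernetes -Pkubernetes-integration-tests \
     -Dtest.exclude.tags=minikube,local \
     -Dspark.kubernetes.test.deployMode=cloud \
     -Dspark.kubernetes.test.master=k8s://https://....eks.amazonaws.com \
     -Dspark.kubernetes.test.imageRepo=... \
     -Dspark.kubernetes.test.imageTag=20220302 \
     "kubernetes-integration-tests/test"
   ```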


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org
