Posted to commits@spark.apache.org by do...@apache.org on 2022/03/09 05:17:11 UTC
[spark] branch master updated: [SPARK-38452][K8S][TESTS] Support pyDockerfile and rDockerfile in SBT K8s IT
This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new 66ff4b6 [SPARK-38452][K8S][TESTS] Support pyDockerfile and rDockerfile in SBT K8s IT
66ff4b6 is described below
commit 66ff4b6d1b6fe22bd025bb645e12272e2d79ad0d
Author: Yikun Jiang <yi...@gmail.com>
AuthorDate: Tue Mar 8 21:16:12 2022 -0800
[SPARK-38452][K8S][TESTS] Support pyDockerfile and rDockerfile in SBT K8s IT
### What changes were proposed in this pull request?
Support pyDockerfile and rDockerfile in SBT K8s IT
### Why are the changes needed?
Enable users to specify `pyDockerfile` and `rDockerfile` separately.
### Does this PR introduce _any_ user-facing change?
No, test only
### How was this patch tested?
```
build/sbt -Pkubernetes -Pkubernetes-integration-tests \
-Dtest.exclude.tags=minikube -Dspark.kubernetes.test.deployMode=docker-desktop \
-Dspark.kubernetes.test.pyDockerFile=/Users/yikun/code/Dockerfile.py \
-Dspark.kubernetes.test.rDockerFile=/Users/yikun/code/Dockerfile.r \
-Dspark.kubernetes.test.dockerFile=/Users/yikun/code/Dockerfile "kubernetes-integration-tests/test"
```
Closes #35772 from Yikun/SPARK-38452.
Authored-by: Yikun Jiang <yi...@gmail.com>
Signed-off-by: Dongjoon Hyun <do...@apache.org>
---
project/SparkBuild.scala | 8 ++++++--
resource-managers/kubernetes/integration-tests/README.md | 11 +++++++++++
2 files changed, 17 insertions(+), 2 deletions(-)
diff --git a/project/SparkBuild.scala b/project/SparkBuild.scala
index 0f06e6b..b536b50 100644
--- a/project/SparkBuild.scala
+++ b/project/SparkBuild.scala
@@ -646,6 +646,10 @@ object KubernetesIntegrationTests {
val javaImageTag = sys.props.get("spark.kubernetes.test.javaImageTag")
val dockerFile = sys.props.getOrElse("spark.kubernetes.test.dockerFile",
s"$sparkHome/resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile.java17")
+ val pyDockerFile = sys.props.getOrElse("spark.kubernetes.test.pyDockerFile",
+ s"$bindingsDir/python/Dockerfile")
+ val rDockerFile = sys.props.getOrElse("spark.kubernetes.test.rDockerFile",
+ s"$bindingsDir/R/Dockerfile")
val extraOptions = if (javaImageTag.isDefined) {
Seq("-b", s"java_image_tag=$javaImageTag")
} else {
@@ -654,8 +658,8 @@ object KubernetesIntegrationTests {
val cmd = Seq(dockerTool,
"-r", imageRepo,
"-t", imageTag.getOrElse("dev"),
- "-p", s"$bindingsDir/python/Dockerfile",
- "-R", s"$bindingsDir/R/Dockerfile") ++
+ "-p", pyDockerFile,
+ "-R", rDockerFile) ++
(if (deployMode != Some("minikube")) Seq.empty else Seq("-m")) ++
extraOptions :+
"build"
diff --git a/resource-managers/kubernetes/integration-tests/README.md b/resource-managers/kubernetes/integration-tests/README.md
index 2151b7f..9eb928d 100644
--- a/resource-managers/kubernetes/integration-tests/README.md
+++ b/resource-managers/kubernetes/integration-tests/README.md
@@ -294,3 +294,14 @@ In addition, you can run a single test selectively.
-Dspark.kubernetes.test.deployMode=docker-desktop \
-Dspark.kubernetes.test.imageTag=2022-03-06 \
'kubernetes-integration-tests/testOnly -- -z "Run SparkPi with a very long application name"'
+
+You can also specify custom Dockerfiles for building the JVM/Python/R images used in testing.
+
+ build/sbt -Psparkr -Pkubernetes -Pkubernetes-integration-tests \
+ -Dtest.exclude.tags=minikube \
+ -Dspark.kubernetes.test.deployMode=docker-desktop \
+ -Dspark.kubernetes.test.imageTag=2022-03-06 \
+ -Dspark.kubernetes.test.dockerFile=/path/to/Dockerfile \
+ -Dspark.kubernetes.test.pyDockerFile=/path/to/py/Dockerfile \
+ -Dspark.kubernetes.test.rDockerFile=/path/to/r/Dockerfile \
+ 'kubernetes-integration-tests/test'
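
The change above follows the usual pattern for build-time overrides: read a JVM system property set via `-D`, and fall back to the in-tree default path when it is absent. The following is a minimal, self-contained sketch of that pattern (the object name, paths, and property usage here are illustrative assumptions, not the actual SparkBuild code):

```scala
// Sketch of the -D override / default-fallback pattern used in SparkBuild.scala.
// Paths and the object name are hypothetical; only the sys.props mechanism is real.
object DockerFileResolver {
  // Return the property value if set (e.g. via -Dspark.kubernetes.test.pyDockerFile=...),
  // otherwise the in-tree default.
  def resolve(prop: String, default: String): String =
    sys.props.getOrElse(prop, default)

  def main(args: Array[String]): Unit = {
    val bindingsDir = "resource-managers/kubernetes/docker/src/main/dockerfiles/spark/bindings"

    // Property unset: falls back to the default Python Dockerfile path.
    println(resolve("spark.kubernetes.test.pyDockerFile", s"$bindingsDir/python/Dockerfile"))

    // Property set (equivalent to passing -Dspark.kubernetes.test.rDockerFile=... to sbt):
    // the override wins.
    sys.props("spark.kubernetes.test.rDockerFile") = "/tmp/Dockerfile.r"
    println(resolve("spark.kubernetes.test.rDockerFile", s"$bindingsDir/R/Dockerfile"))
  }
}
```

Note that property names are matched exactly and case-sensitively, which is why `-Dspark.kubernetes.test.dockerFile` works while a differently-cased spelling would be silently ignored and the default used instead.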