Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/03/08 18:29:55 UTC

[GitHub] [spark] vanzin commented on a change in pull request #23380: [SPARK-26343][KUBERNETES] Try to speed up running local k8s integration tests

URL: https://github.com/apache/spark/pull/23380#discussion_r263880492
 
 

 ##########
 File path: resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh
 ##########
 @@ -58,50 +59,59 @@ while (( "$#" )); do
   shift
 done
 
-if [[ $SPARK_TGZ == "N/A" ]];
+rm -rf "$UNPACKED_SPARK_TGZ"
+if [[ $SPARK_TGZ == "N/A" && $IMAGE_TAG == "N/A" ]];
 then
-  echo "Must specify a Spark tarball to build Docker images against with --spark-tgz." && exit 1;
+  # If there is no spark image tag to test with and no src dir, build from current
+  SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
+  SPARK_INPUT_DIR="$(cd "$SCRIPT_DIR/"../../../../  >/dev/null 2>&1 && pwd )"
+  DOCKER_FILE_BASE_PATH="$SPARK_INPUT_DIR/resource-managers/kubernetes/docker/src/main/dockerfiles/spark"
+elif [[ $IMAGE_TAG == "N/A" ]];
+then
+  # If there is a test src tarball and no image tag we will want to build from that
+  mkdir -p $UNPACKED_SPARK_TGZ
+  tar -xzvf $SPARK_TGZ --strip-components=1 -C $UNPACKED_SPARK_TGZ;
+  SPARK_INPUT_DIR="$UNPACKED_SPARK_TGZ"
+  DOCKER_FILE_BASE_PATH="$SPARK_INPUT_DIR/kubernetes/dockerfiles/spark"
 fi
 
-rm -rf $UNPACKED_SPARK_TGZ
-mkdir -p $UNPACKED_SPARK_TGZ
-tar -xzvf $SPARK_TGZ --strip-components=1 -C $UNPACKED_SPARK_TGZ;
 
+# If there is a specific Spark image skip building and extraction/copy
 if [[ $IMAGE_TAG == "N/A" ]];
 then
   IMAGE_TAG=$(uuidgen);
-  cd $UNPACKED_SPARK_TGZ
+  cd $SPARK_INPUT_DIR
 
   # Build PySpark image
-  LANGUAGE_BINDING_BUILD_ARGS="-p $UNPACKED_SPARK_TGZ/kubernetes/dockerfiles/spark/bindings/python/Dockerfile"
+  LANGUAGE_BINDING_BUILD_ARGS="-p $DOCKER_FILE_BASE_PATH/bindings/python/Dockerfile"
 
   # Build SparkR image
-  LANGUAGE_BINDING_BUILD_ARGS="$LANGUAGE_BINDING_BUILD_ARGS -R $UNPACKED_SPARK_TGZ/kubernetes/dockerfiles/spark/bindings/R/Dockerfile"
+  LANGUAGE_BINDING_BUILD_ARGS="$LANGUAGE_BINDING_BUILD_ARGS -R $DOCKER_FILE_BASE_PATH/bindings/R/Dockerfile"
 
 Review comment:
  If you want to enable `-e`, I'm pretty sure you can just comment this out. The R integration tests are currently disabled because they don't work for some reason (SPARK-25152).
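
The diff above replaces the unconditional tarball extraction with a three-way decision: build images from the current source tree when neither a tarball nor an image tag is given, build from the unpacked tarball when only a tarball is given, and skip building entirely when an image tag is supplied. A minimal sketch of that branching logic, for illustration only (the `select_build_mode` helper name is hypothetical and not part of the actual script):

```shell
#!/usr/bin/env bash
# Hypothetical helper mirroring the branching the diff introduces in
# setup-integration-test-env.sh. "N/A" is the script's sentinel for
# "not provided", matching the defaults it checks against.
select_build_mode() {
  spark_tgz="$1"
  image_tag="$2"
  if [ "$spark_tgz" = "N/A" ] && [ "$image_tag" = "N/A" ]; then
    # No tarball and no image tag: build from the current checkout,
    # using the Dockerfiles under resource-managers/kubernetes/docker.
    echo "build-from-source-tree"
  elif [ "$image_tag" = "N/A" ]; then
    # Tarball given but no image tag: unpack the tarball and build
    # from its bundled kubernetes/dockerfiles/spark directory.
    echo "build-from-tarball"
  else
    # Image tag given: reuse that image and skip building/extraction.
    echo "use-existing-image"
  fi
}

select_build_mode "N/A" "N/A"                 # build-from-source-tree
select_build_mode "spark-3.0.tgz" "N/A"       # build-from-tarball
select_build_mode "N/A" "my-test-tag"         # use-existing-image
```

Note that the real script also sets `SPARK_INPUT_DIR` and `DOCKER_FILE_BASE_PATH` differently in the first two branches, since the tarball lays out its Dockerfiles under a different path than the source tree.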
