Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/07/29 06:34:00 UTC

[jira] [Created] (SPARK-28550) Unset SPARK_HOME environment variable in K8S integration preparation

Hyukjin Kwon created SPARK-28550:
------------------------------------

             Summary: Unset SPARK_HOME environment variable in K8S integration preparation
                 Key: SPARK-28550
                 URL: https://issues.apache.org/jira/browse/SPARK-28550
             Project: Spark
          Issue Type: Test
          Components: Kubernetes, Tests
    Affects Versions: 3.0.0
            Reporter: Hyukjin Kwon


Currently, if we run the Kubernetes integration tests with SPARK_HOME already set, they refer to that SPARK_HOME even when {{--spark-tgz}} is specified:

{code}
export SPARK_HOME=`pwd`
dev/make-distribution.sh --pip --tgz -Phadoop-2.7 -Pkubernetes
resource-managers/kubernetes/integration-tests/dev/dev-run-integration-tests.sh --deploy-mode docker-for-desktop --spark-tgz $PWD/spark-*.tgz
{code}

{code}
+ /.../spark/resource-managers/kubernetes/integration-tests/target/spark-dist-unpacked/bin/docker-image-tool.sh -r docker.io/kubespark -t 650B51C8-BBED-47C9-AEAB-E66FC9A0E64E -p /.../spark/resource-managers/kubernetes/integration-tests/target/spark-dist-unpacked/kubernetes/dockerfiles/spark/bindings/python/Dockerfile build
cp: resource-managers/kubernetes/docker/src/main/dockerfiles: No such file or directory
cp: assembly/target/scala-2.12/jars: No such file or directory
cp: resource-managers/kubernetes/integration-tests/tests: No such file or directory
cp: examples/target/scala-2.12/jars/*: No such file or directory
cp: resource-managers/kubernetes/docker/src/main/dockerfiles: No such file or directory
cp: resource-managers/kubernetes/docker/src/main/dockerfiles: No such file or directory
{code}
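The {{cp}} failures above occur because the relative paths are resolved against the stale SPARK_HOME rather than the unpacked {{--spark-tgz}} distribution. A minimal sketch of the proposed preparation step, assuming the test scripts fall back to the tarball's own location once the variable is gone:

{code}
#!/usr/bin/env bash
# Simulate the problematic environment: SPARK_HOME points at the
# local checkout rather than the unpacked distribution.
export SPARK_HOME=$(pwd)

# Proposed fix: drop the stale value before the integration tests
# build Docker images, so paths resolve from the --spark-tgz archive.
unset SPARK_HOME

# Verify the variable is no longer set.
echo "SPARK_HOME=${SPARK_HOME:-<unset>}"
{code}

With SPARK_HOME unset, {{docker-image-tool.sh}} invoked from the unpacked distribution derives its paths from its own install location instead of the external checkout.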




--
This message was sent by Atlassian JIRA
(v7.6.14#76016)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org