Posted to issues@spark.apache.org by "Rob Vesse (JIRA)" <ji...@apache.org> on 2019/01/22 11:06:00 UTC

[jira] [Created] (SPARK-26687) Building Spark Images has non-intuitive behaviour with paths to custom Dockerfiles

Rob Vesse created SPARK-26687:
---------------------------------

             Summary: Building Spark Images has non-intuitive behaviour with paths to custom Dockerfiles
                 Key: SPARK-26687
                 URL: https://issues.apache.org/jira/browse/SPARK-26687
             Project: Spark
          Issue Type: Improvement
          Components: Kubernetes
    Affects Versions: 2.4.0
            Reporter: Rob Vesse


With the changes from SPARK-26025 (https://github.com/apache/spark/pull/23019) we use a pared-down Docker build context, which significantly improves build times. However, the way this is implemented leads to non-intuitive behaviour when supplying custom Dockerfile paths, because of code snippets like the following:

{code}
# cd into the temporary per-image build context and build from there;
# docker then resolves the -f path relative to that directory
(cd $(img_ctx_dir base) && docker build $NOCACHEARG "${BUILD_ARGS[@]}" \
    -t $(image_ref spark) \
    -f "$BASEDOCKERFILE" .)
{code}

Since the script changes to the temporary build context directory and then runs {{docker build}} there, any path given for the Dockerfile is interpreted as relative to the temporary build context directory rather than to the directory where the user invoked the script. This produces somewhat unhelpful errors, e.g.

{noformat}
> ./bin/docker-image-tool.sh -r rvesse -t badpath -p resource-managers/kubernetes/docker/src/main/dockerfiles/spark/bindings/python/Dockerfile build
Sending build context to Docker daemon  218.4MB
Step 1/15 : FROM openjdk:8-alpine
 ---> 5801f7d008e5
Step 2/15 : ARG spark_uid=185
 ---> Using cache
 ---> 5fd63df1ca39
...
Successfully tagged rvesse/spark:badpath
unable to prepare context: unable to evaluate symlinks in Dockerfile path: lstat /Users/rvesse/Documents/Work/Code/spark/target/tmp/docker/pyspark/resource-managers: no such file or directory
Failed to build PySpark Docker image, please refer to Docker build output for details.
{noformat}

Here we can see that a relative path that was valid in the directory where the user typed the command is not valid inside the build context directory.
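
For illustration, a minimal sketch of the mismatch (run from the repository root; the pyspark context path is taken from the error output above):

{code}
# The Dockerfile path is valid relative to where the user runs the script...
DOCKERFILE=resource-managers/kubernetes/docker/src/main/dockerfiles/spark/bindings/python/Dockerfile
ls "$DOCKERFILE"    # succeeds from the repository root

# ...but the script cd's into the temporary context first, so docker
# resolves the same relative path under that directory instead:
(cd target/tmp/docker/pyspark && docker build -f "$DOCKERFILE" .)
# -> fails: target/tmp/docker/pyspark/resource-managers does not exist
{code}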

To resolve this we need to ensure that relative Dockerfile paths are resolved against the directory from which the user invoked the script, before the script changes into the build context directory.
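
One possible approach (a minimal sketch; the {{resolve_file}} helper name is hypothetical) is to convert any user-supplied Dockerfile path to an absolute path up front:

{code}
# Turn a possibly-relative path into an absolute one, resolved against
# the caller's current working directory (hypothetical helper)
resolve_file() {
  local file=$1
  if [ -n "$file" ]; then
    local dir
    dir=$(dirname "$file")
    file="$(cd "$dir" >/dev/null && pwd)/$(basename "$file")"
  fi
  echo "$file"
}

# Resolve before any cd into the build context, e.g.:
BASEDOCKERFILE=$(resolve_file "$BASEDOCKERFILE")
{code}

With the path made absolute before any directory change, the {{(cd ... && docker build ...)}} subshell no longer alters its meaning.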


