Posted to issues@spark.apache.org by "Samuel Suter (Jira)" <ji...@apache.org> on 2020/08/05 13:19:00 UTC

[jira] [Created] (SPARK-32545) Applications with Maven dependencies fail when submitted in cluster mode on K8s

Samuel Suter created SPARK-32545:
------------------------------------

             Summary: Applications with Maven dependencies fail when submitted in cluster mode on K8s
                 Key: SPARK-32545
                 URL: https://issues.apache.org/jira/browse/SPARK-32545
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.0.0
            Reporter: Samuel Suter




A Spark application with Maven dependencies (specified with {{--packages}}) submitted in cluster mode on Kubernetes fails with a {{ClassNotFoundException}} because the resolved dependency jars are never made available to the executors.

I think this was introduced by SPARK-23153 (in {{SparkSubmit.scala}}): since that change, the executors are no longer told to fetch the resolved Maven dependencies from the driver.
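
One way to check this on a running deployment is to compare what the driver resolved against what the executors actually fetched. A sketch, assuming stock Spark 3 images (which use {{/opt/spark/work-dir}} as the working directory); the pod names are placeholders:
{code:bash}
# Diagnostic sketch; "app-driver" and "app-exec-1" are placeholder pod names.
# The driver log should show Ivy resolving the coordinates:
kubectl logs app-driver | grep -i ivy
# With this bug, the resolved jars never appear in the executor working dir:
kubectl exec app-exec-1 -- ls /opt/spark/work-dir
{code}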

Sample {{spark-submit}}:
{code:bash}
# Note: --class below is a placeholder for the application's actual main
# class, which spark-submit requires for a JVM application jar.
spark-submit \
    --class org.example.Main \
    --deploy-mode cluster \
    --master k8s://https://kubernetes.docker.internal:6443 \
    --conf spark.kubernetes.container.image=some_spark3_image \
    --conf spark.kubernetes.driver.container.image=some_spark3_image \
    --conf spark.driver.cores=1 \
    --conf spark.driver.memory=512m \
    --conf spark.executor.cores=1 \
    --conf spark.executor.memory=512m \
    --conf spark.executor.instances=1 \
    --conf spark.jars.packages=org:artifact_2.12:1.0.0 \
    http://hostname/app_2.12-0.1.0.jar
{code}

If {{app}} imports any class from {{org.artifact}}, the executors fail with a {{ClassNotFoundException}}.
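
Until this is fixed, one workaround is to skip {{--packages}} and distribute the already-resolved dependency jars with {{--jars}} from a location the executors can reach. A sketch; the dependency URL is a placeholder, served the same way as the app jar above:
{code:bash}
# Workaround sketch: serve the pre-resolved dependency jar over HTTP and pass
# it with --jars, which executors download themselves (resource settings as in
# the sample above are omitted for brevity).
spark-submit \
    --class org.example.Main \
    --deploy-mode cluster \
    --master k8s://https://kubernetes.docker.internal:6443 \
    --conf spark.kubernetes.container.image=some_spark3_image \
    --jars http://hostname/artifact_2.12-1.0.0.jar \
    http://hostname/app_2.12-0.1.0.jar
{code}
Building a fat jar that bundles the dependencies, or baking them into the container image, sidesteps dependency resolution entirely and also works.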




