Posted to issues@spark.apache.org by "Keunhyun Oh (Jira)" <ji...@apache.org> on 2021/04/15 05:26:00 UTC

[jira] [Created] (SPARK-35084) [k8s] On Spark 3, jars listed in spark.jars and spark.jars.packages are not added to sparkContext

Keunhyun Oh created SPARK-35084:
-----------------------------------

             Summary: [k8s] On Spark 3, jars listed in spark.jars and spark.jars.packages are not added to sparkContext
                 Key: SPARK-35084
                 URL: https://issues.apache.org/jira/browse/SPARK-35084
             Project: Spark
          Issue Type: Question
          Components: Kubernetes
    Affects Versions: 3.1.1, 3.0.2, 3.0.0
            Reporter: Keunhyun Oh


I'm trying to migrate from Spark 2 to Spark 3.


In my environment, on Spark 3.x, jars listed in spark.jars and spark.jars.packages are not added to the SparkContext.
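
For reference, my submit configuration is roughly shaped like this (the API server address, image name, package coordinate, repository URL, and jar paths below are placeholders, not my real values):

    spark-submit \
      --master k8s://https://<api-server>:6443 \
      --deploy-mode cluster \
      --conf spark.kubernetes.container.image=<spark-image> \
      --conf spark.jars.packages=org.apache.logging.log4j:log4j-api:2.14.1 \
      --conf spark.jars=https://repo.example.com/libs/extra-dep.jar \
      local:///opt/spark/app/main-application.jar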


After the driver process is launched, the jars are not propagated to the executors, so a NoClassDefFoundError is raised on the executors.
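
This is easy to confirm from the driver (e.g. in spark-shell) with the standard SparkContext API:

    // Lists the jars the SparkContext has registered for
    // distribution to the executors.
    spark.sparkContext.listJars().foreach(println)
    // On Spark 2 this included every spark.jars entry plus the
    // resolved spark.jars.packages artifacts; on Spark 3 on k8s
    // only the main application jar shows up.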


In spark.properties, spark.jars contains only the main application jar. This is different from Spark 2.
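
Concretely, the properties file generated for the driver ends up with only something like this (the exact path and scheme here are illustrative):

    spark.jars=local:///opt/spark/app/main-application.jar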


How can I solve this? Did any Spark options related to jar distribution change between Spark 2 and Spark 3?
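
The only workaround sketch I can think of is registering the jars explicitly from driver code (the URL below is a placeholder), but I'd still like to know the intended configuration:

    // Explicitly ship a jar to the executors via the driver's
    // file server; addJar is the standard SparkContext API.
    spark.sparkContext.addJar("https://repo.example.com/libs/extra-dep.jar")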



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org