Posted to issues@spark.apache.org by "Keunhyun Oh (Jira)" <ji...@apache.org> on 2021/04/15 05:32:00 UTC

[jira] [Updated] (SPARK-35084) [k8s] On Spark 3, jars listed in spark.jars and spark.jars.packages are not added to sparkContext

     [ https://issues.apache.org/jira/browse/SPARK-35084?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Keunhyun Oh updated SPARK-35084:
--------------------------------
    Description: 
I'm trying to migrate from Spark 2 to Spark 3 on k8s.

 

In my environment, on Spark 3.x, jars listed in spark.jars and spark.jars.packages are not added to the SparkContext.

After the driver process is launched, the jars are not propagated to the executors, so a NoClassDefFoundError is raised in the executors.
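
To be concrete, the submission looks roughly like the sketch below; the image name, package coordinates, jar paths, and class name are placeholders rather than my real values:

    # sketch of a Spark-on-k8s submission with extra jars and packages (placeholder values)
    spark-submit \
      --master k8s://https://<k8s-apiserver>:6443 \
      --deploy-mode cluster \
      --class com.example.Main \
      --conf spark.kubernetes.container.image=<my-spark-image> \
      --conf spark.jars.packages=org.apache.kafka:kafka-clients:2.6.0 \
      --conf spark.jars=local:///opt/spark/jars/extra-lib.jar \
      local:///opt/spark/jars/my-app.jar

With Spark 2, this kind of submission made the extra jar and the resolved packages available to the executors; with Spark 3 on k8s it does not.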

 

In spark.properties, spark.jars contains only the main application jar. This is different from Spark 2.
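
For illustration only (the path is a placeholder), the spark.jars line in the driver's spark.properties on Spark 3 looks like

    spark.jars=local:///opt/spark/jars/my-app.jar

whereas on Spark 2 spark.jars was not limited to the application jar.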

 

How can I solve this? Were any Spark options related to jar distribution changed between Spark 2 and Spark 3?

  was:
I'm trying to migrate from Spark 2 to Spark 3.

 

In my environment, on Spark 3.x, jars listed in spark.jars and spark.jars.packages are not added to the SparkContext.


After the driver process is launched, the jars are not propagated to the executors, so a NoClassDefFoundError is raised in the executors.

 

In spark.properties, spark.jars contains only the main application jar. This is different from Spark 2.

 

How can I solve this? Were any Spark options related to jar distribution changed between Spark 2 and Spark 3?


> [k8s] On Spark 3, jars listed in spark.jars and spark.jars.packages are not added to sparkContext
> -------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-35084
>                 URL: https://issues.apache.org/jira/browse/SPARK-35084
>             Project: Spark
>          Issue Type: Question
>          Components: Kubernetes
>    Affects Versions: 3.0.0, 3.0.2, 3.1.1
>            Reporter: Keunhyun Oh
>            Priority: Major
>
> I'm trying to migrate from Spark 2 to Spark 3 on k8s.
>  
> In my environment, on Spark 3.x, jars listed in spark.jars and spark.jars.packages are not added to the SparkContext.
> After the driver process is launched, the jars are not propagated to the executors, so a NoClassDefFoundError is raised in the executors.
>  
> In spark.properties, spark.jars contains only the main application jar. This is different from Spark 2.
>  
> How can I solve this? Were any Spark options related to jar distribution changed between Spark 2 and Spark 3?



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org