Posted to issues@spark.apache.org by "Cory Maklin (Jira)" <ji...@apache.org> on 2021/03/25 15:52:00 UTC

[jira] [Updated] (SPARK-34870) Jars downloaded with the --packages argument are not added to the classpath for executors.

     [ https://issues.apache.org/jira/browse/SPARK-34870?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Cory Maklin updated SPARK-34870:
--------------------------------
    Description: 
When Spark is run in local mode, it works as expected. However, when Spark is run in client mode, it copies the jars to each executor ($SPARK_HOME/work/<app id>/<executor id>) but never adds them to the executors' classpath.

It might be worth noting that `spark.jars` does add the jars to the classpath, but, unlike `spark.jars.packages`, it does not automatically download the jars' transitive dependencies.

 

```
spark = SparkSession.builder \
    .master(SPARK_MASTER) \
    .appName(APP_NAME) \
    ...
    .config("spark.jars.packages", DEPENDENCY_PACKAGES) \
    ...
    .getOrCreate()
```
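For reference, a minimal sketch of the configuration involved. SPARK_MASTER, APP_NAME, and DEPENDENCY_PACKAGES are placeholders standing in for the reporter's real values, and the Maven coordinates below are illustrative only, not taken from the report:

```python
# Placeholders; the real deployment uses a Bitnami Spark worker on Kubernetes
# with the driver in a Jupyter pod (see Environment below).
SPARK_MASTER = "spark://spark-master:7077"
APP_NAME = "example-app"
DEPENDENCY_PACKAGES = "org.apache.spark:spark-avro_2.12:3.0.0"  # illustrative coordinates

def build_spark_conf(master, app_name, packages):
    """Return the key/value pairs the builder chain above would set."""
    return {
        "spark.master": master,
        "spark.app.name": app_name,
        # Ivy resolves these packages and their transitive dependencies,
        # but, per this report, the jars copied to
        # $SPARK_HOME/work/<app id>/<executor id> are never put on the
        # executors' classpath when running in client mode.
        "spark.jars.packages": packages,
        # One commonly suggested workaround (untested here, path is an
        # assumption): point the executors at the download directory, e.g.
        # "spark.executor.extraClassPath": "/opt/bitnami/spark/work/*",
    }

conf = build_spark_conf(SPARK_MASTER, APP_NAME, DEPENDENCY_PACKAGES)
```

Whether `spark.executor.extraClassPath` actually picks up the per-app work directory depends on the deployment; it is mentioned only as a possible avenue, not a confirmed fix.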

 


> Jars downloaded with the --packages argument are not added to the classpath for executors.
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-34870
>                 URL: https://issues.apache.org/jira/browse/SPARK-34870
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 3.0.0
>         Environment: Spark worker running inside a Kubernetes pod with a Bitnami Spark image, and the driver running inside of a Jupyter Spark Kubernetes pod.
>            Reporter: Cory Maklin
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org