Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2019/02/13 21:18:00 UTC

[jira] [Resolved] (SPARK-8622) Spark 1.3.1 and 1.4.0 doesn't put executor working directory on executor classpath

     [ https://issues.apache.org/jira/browse/SPARK-8622?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-8622.
-----------------------------------
    Resolution: Not A Problem

This works as designed. Jars passed via {{--jars}} are added to Spark's application class loader, which is separate from the system classpath, so they are not expected to appear on the executor's system classpath.
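The distinction can be illustrated with a hedged sketch (the master URL, jar names, and paths below are hypothetical, for illustration only): jars listed with {{--jars}} are shipped to each executor's working directory and loaded through Spark's application class loader, while {{spark.executor.extraClassPath}} is what adds entries to the executor JVM's system classpath.

```shell
# Hypothetical jar names and master URL, for illustration only.
# Jars listed with --jars are copied to each executor's working
# directory and served through Spark's application class loader:
spark-submit \
  --master spark://master:7077 \
  --jars my-custom.jar \
  my-app.jar

# If code resolved by the system class loader must see those jars,
# the executor working directory can be put on the system classpath
# (the workaround described in this report), e.g. in
# conf/spark-defaults.conf:
#   spark.executor.extraClassPath   .
```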

> Spark 1.3.1 and 1.4.0 doesn't put executor working directory on executor classpath
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-8622
>                 URL: https://issues.apache.org/jira/browse/SPARK-8622
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 1.3.1, 1.4.0
>            Reporter: Baswaraj
>            Priority: Major
>
> I ran into an issue where the executor is not able to pick up my configs/functions from my custom jar in standalone (client/cluster) deploy mode. I used the spark-submit {{--jars}} option to specify all the jars and configs to be used by executors.
> All these files are placed in the working directory of the executor, but not on the executor classpath. Also, the executor working directory itself is not on the executor classpath.
> I expect the executor to find all files specified via the spark-submit {{--jars}} option.
> In Spark 1.3.0 the executor working directory is on the executor classpath, so the app runs successfully.
> To run my application successfully with Spark 1.3.1+, I have to set the following option (in conf/spark-defaults.conf):
> spark.executor.extraClassPath   .
> Please advise.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org