Posted to issues@spark.apache.org by "Baswaraj (JIRA)" <ji...@apache.org> on 2015/06/25 08:16:04 UTC

[jira] [Updated] (SPARK-8622) Spark 1.3.1 and 1.4.0 doesn't put executor working directory on executor classpath

     [ https://issues.apache.org/jira/browse/SPARK-8622?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Baswaraj updated SPARK-8622:
----------------------------
    Summary: Spark 1.3.1 and 1.4.0 doesn't put executor working directory on executor classpath  (was: Spark 1.3.1 and 1.4.0 doesn't put executor working directory on execitor classpath)

> Spark 1.3.1 and 1.4.0 doesn't put executor working directory on executor classpath
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-8622
>                 URL: https://issues.apache.org/jira/browse/SPARK-8622
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 1.3.1, 1.4.0
>            Reporter: Baswaraj
>
> I ran into an issue where the executor is not able to pick up my configs/functions from my custom jar in standalone (client/cluster) deploy mode. I used the spark-submit --jars option to specify all the jars and configs to be used by the executors.
> All of these files are placed in the executor's working directory, but they are not on the executor classpath. Also, the executor working directory itself is not on the executor classpath.
> I expect the executor to find all files passed via the spark-submit --jars option.
> In Spark 1.3.0 the executor working directory is on the executor classpath.
> To successfully run my application with Spark 1.3.1+, I have to add the following entry to conf/spark-defaults.conf on the slaves:
> spark.executor.extraClassPath   .
> Please advise.
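
For readers hitting the same symptom, the workaround described above amounts to the following sketch (master URL, jar names, and the main class are illustrative, not from the report):

```shell
# Workaround sketch for SPARK-8622 (all names below are hypothetical).
#
# 1) On each worker node, put the executor's working directory (".") back on
#    the executor classpath via conf/spark-defaults.conf:
#
#      spark.executor.extraClassPath   .
#
# 2) Submit as usual. Files shipped with --jars are copied into the executor's
#    working directory, which step 1 makes resolvable on the classpath:
spark-submit \
  --master spark://master:7077 \
  --deploy-mode client \
  --jars my-custom.jar,my-configs.jar \
  --class com.example.MyApp \
  my-app.jar
```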



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org