Posted to issues@spark.apache.org by "Gurpreet Singh (JIRA)" <ji...@apache.org> on 2014/12/23 07:21:13 UTC

[jira] [Commented] (SPARK-4160) Standalone cluster mode does not upload all needed jars to driver node

    [ https://issues.apache.org/jira/browse/SPARK-4160?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14256653#comment-14256653 ] 

Gurpreet Singh commented on SPARK-4160:
---------------------------------------

Looks like this bug also affects YARN cluster mode. spark-submit is not copying the jars/files specified via the --jars and --files options. This was working in version 1.0.2.

> Standalone cluster mode does not upload all needed jars to driver node
> ----------------------------------------------------------------------
>
>                 Key: SPARK-4160
>                 URL: https://issues.apache.org/jira/browse/SPARK-4160
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.0
>            Reporter: Marcelo Vanzin
>
> If you look at the code in {{DriverRunner.scala}}, there is code to download the main application jar from the launcher node. But that's the only jar that's downloaded - if the driver depends on one of the jars or files specified via {{spark-submit --jars <list> --files <list>}}, it won't be able to run.
> It should be possible to use the same mechanism to distribute the other files to the driver node, even if that's not the most efficient way of doing it. That way, at least, you don't need any external dependencies to be able to distribute the files.
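For illustration, a minimal sketch of the kind of invocation affected, and a commonly suggested workaround. All paths, class names, and host names below are hypothetical; the exact failure mode (typically a ClassNotFoundException in the driver) depends on how the application uses the missing jar.

```shell
# Hypothetical reproduction: app.jar depends on dep.jar, both local to the
# launcher machine. In standalone cluster mode, DriverRunner downloads only
# app.jar to the driver node; dep.jar and app.conf are never copied there,
# so the driver cannot load classes or files it depends on.
spark-submit \
  --master spark://master:7077 \
  --deploy-mode cluster \
  --class com.example.Main \
  --jars /local/path/dep.jar \
  --files /local/path/app.conf \
  /local/path/app.jar

# A common workaround until the distribution mechanism covers --jars/--files:
# place the dependencies at a location visible from every node (a shared
# filesystem or a pre-copied identical local path), so the driver can resolve
# them without any upload step.
spark-submit \
  --master spark://master:7077 \
  --deploy-mode cluster \
  --class com.example.Main \
  --jars /shared/nfs/jars/dep.jar \
  /local/path/app.jar
```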



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org