Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 05:35:23 UTC

[jira] [Updated] (SPARK-4160) Standalone cluster mode does not upload all needed jars to driver node

     [ https://issues.apache.org/jira/browse/SPARK-4160?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-4160:
--------------------------------
    Labels: bulk-closed  (was: )

> Standalone cluster mode does not upload all needed jars to driver node
> ----------------------------------------------------------------------
>
>                 Key: SPARK-4160
>                 URL: https://issues.apache.org/jira/browse/SPARK-4160
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.2.0
>            Reporter: Marcelo Vanzin
>            Priority: Major
>              Labels: bulk-closed
>
> The code in {{DriverRunner.scala}} downloads the main application jar from the launcher node, but that is the only artifact it downloads: if the driver depends on any of the jars or files specified via {{spark-submit --jars <list> --files <list>}}, it will not be able to run.
> It should be possible to reuse the same mechanism to distribute the remaining files to the driver node, even if that is not the most efficient way of doing it. That way, at least, no external dependencies are needed to distribute the files.
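
For context, a failing submission would look something like the following (the master URL, class name, jar names, and file names here are hypothetical, chosen only to illustrate the flags the description quotes):

    spark-submit --master spark://master:7077 --deploy-mode cluster \
      --class com.example.Main \
      --jars extra-lib.jar --files app.conf \
      main-app.jar

In standalone cluster mode, the worker chosen to run the driver fetches main-app.jar, but extra-lib.jar and app.conf are never copied over, so a driver that needs them cannot run.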
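
As for the proposed fix, here is a minimal sketch of the idea, assuming a simple fetch helper for http:// and file:// URIs. It is not Spark's actual code (DriverRunner uses its own internal fetch utility); it only illustrates extending the existing single-jar download step to cover every --jars/--files entry:

    import java.io.File
    import java.net.URI
    import java.nio.file.{Files, StandardCopyOption}

    object DriverFileDistribution {
      // Fetch one remote artifact into the driver's working directory.
      // Handles plain http:// and file:// URIs only; a hypothetical
      // stand-in for the mechanism DriverRunner already applies to the
      // main application jar.
      def fetchToDir(uri: String, targetDir: File): File = {
        val target = new File(targetDir, new File(URI.create(uri).getPath).getName)
        val in = URI.create(uri).toURL.openStream()
        try {
          Files.copy(in, target.toPath, StandardCopyOption.REPLACE_EXISTING)
        } finally {
          in.close()
        }
        target
      }

      // Download the main jar plus every --jars/--files entry with the
      // same mechanism, instead of the main jar alone.
      def downloadAll(mainJarUrl: String, extraUris: Seq[String], driverDir: File): Unit =
        (mainJarUrl +: extraUris).foreach(fetchToDir(_, driverDir))
    }

Run on the worker before the driver process is launched, downloadAll would leave every dependency in the driver's working directory with no external distribution system required, which matches the "not the most efficient, but dependency-free" trade-off the description suggests.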



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org