Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2018/06/27 17:59:00 UTC

[jira] [Resolved] (SPARK-24446) Library path with special characters breaks Spark on YARN

     [ https://issues.apache.org/jira/browse/SPARK-24446?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-24446.
------------------------------------
       Resolution: Fixed
         Assignee: Marcelo Vanzin
    Fix Version/s: 2.4.0

> Library path with special characters breaks Spark on YARN
> ---------------------------------------------------------
>
>                 Key: SPARK-24446
>                 URL: https://issues.apache.org/jira/browse/SPARK-24446
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 2.3.0
>            Reporter: Marcelo Vanzin
>            Assignee: Marcelo Vanzin
>            Priority: Minor
>             Fix For: 2.4.0
>
>
> When YARN runs the application's main command, it does it like this:
> {code}
> bash -c "<your command goes here>"
> {code}
> The way Spark injects the library path into that command makes it look like this:
> {code}
> bash -c "LD_LIBRARY_PATH="/foo:/bar:/baz:$LD_LIBRARY_PATH" <rest of the command>"
> {code}
> That only works by luck: bash happens to concatenate the quoted and unquoted pieces back into a single valid command. But if the library path contains something like a space or an ampersand, the command string is cut short and all containers fail with only a cryptic message like the following:
> {noformat}
> WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Executor for container container_1475411358336_0010_01_000002 exited because of a YARN event (e.g., pre-emption) and not because of an error in the running job.
> {noformat}
> And there is no useful log output.
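> A minimal sketch of the breakage, runnable in any bash shell (the paths here are made up for illustration, and the last form is just one robust alternative, not necessarily the committed fix):
> {code}
> # Works by luck: the inner quotes close and reopen the outer quoted
> # string, but bash glues the pieces back into one -c argument.
> bash -c "LD_LIBRARY_PATH="/opt/native" printenv LD_LIBRARY_PATH"
> # -> /opt/native
>
> # Breaks: the space ends the -c argument early, so the command string
> # becomes a bare variable assignment and printenv never runs.
> bash -c "LD_LIBRARY_PATH="/opt/my native" printenv LD_LIBRARY_PATH"
> # -> (no output; an ampersand in the path is even worse, since the
> #     outer shell treats it as a control operator)
>
> # One robust form: escape the inner quotes so the whole command
> # stays a single -c argument no matter what the path contains.
> bash -c "LD_LIBRARY_PATH=\"/opt/my native:\$LD_LIBRARY_PATH\" printenv LD_LIBRARY_PATH"
> # -> /opt/my native:...
> {code}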


