Posted to issues@spark.apache.org by "Thomas Graves (JIRA)" <ji...@apache.org> on 2014/09/03 15:37:51 UTC

[jira] [Commented] (SPARK-2718) YARN does not handle spark configs with quotes or backslashes

    [ https://issues.apache.org/jira/browse/SPARK-2718?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14119854#comment-14119854 ] 

Thomas Graves commented on SPARK-2718:
--------------------------------------

[~andrewor] the PR for this is closed, can we close the JIRA also?

> YARN does not handle spark configs with quotes or backslashes
> -------------------------------------------------------------
>
>                 Key: SPARK-2718
>                 URL: https://issues.apache.org/jira/browse/SPARK-2718
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 1.0.2
>            Reporter: Andrew Or
>            Assignee: Marcelo Vanzin
>            Priority: Minor
>             Fix For: 1.1.0
>
>
> Say we have the following config:
> {code}
> spark.app.name spark shell with spaces and "quotes " and \ backslashes \
> {code}
> This works in standalone mode but not in YARN mode. Standalone mode launches processes through Java's ProcessBuilder, which passes each argument to the child process verbatim, while YARN mode builds its launch command through org.apache.hadoop.yarn.api.records.ContainerLaunchContext, whose command string is later re-parsed by a shell. As a result, submitting an application to YARN with the given config leads to the following exception (a sketch of the difference follows the trace):
> {code}
> line 0: unexpected EOF while looking for matching `"'
> syntax error: unexpected end of file
>   at org.apache.hadoop.util.Shell.runCommand(Shell.java:505)
>   at org.apache.hadoop.util.Shell.run(Shell.java:418)
>   at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
>   ...
> {code}
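>
> A minimal sketch of the difference, in Scala, assuming a hypothetical escapeForShell helper (the actual fix added a similar escaping utility to the YARN backend; this is an illustration, not Spark's code). ProcessBuilder receives arguments as a list, so nothing is re-parsed; a ContainerLaunchContext command is re-parsed by bash, so special characters must be escaped first:
> {code}
> // Sketch only, not Spark's actual code.
> val appName = """spark shell with spaces and "quotes " and \ backslashes \"""
>
> // Standalone mode: ProcessBuilder passes each argument verbatim to the
> // child process; no shell ever re-parses the value.
> val standaloneCmd = new java.lang.ProcessBuilder("java", "-Dspark.app.name=" + appName)
>
> // YARN mode: the launch command becomes a single string that bash
> // re-parses, so the value must be single-quoted with embedded quotes
> // escaped. Hypothetical helper, similar in spirit to the fix:
> def escapeForShell(arg: String): String =
>   "'" + arg.flatMap {
>     case '\'' => "'\\''"  // close quote, insert escaped quote, reopen
>     case c    => c.toString
>   } + "'"
>
> val yarnCmd = Seq("java", escapeForShell("-Dspark.app.name=" + appName)).mkString(" ")
> {code}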


