Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/05/05 10:12:06 UTC

[jira] [Updated] (SPARK-7310) SparkSubmit does not escape & for java options and ^& won't work

     [ https://issues.apache.org/jira/browse/SPARK-7310?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-7310:
-----------------------------
          Component/s: Spark Submit
             Priority: Minor  (was: Major)
    Affects Version/s: 1.4.0
                       1.3.1

> SparkSubmit does not escape & for java options and ^& won't work
> ----------------------------------------------------------------
>
>                 Key: SPARK-7310
>                 URL: https://issues.apache.org/jira/browse/SPARK-7310
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.3.1, 1.4.0
>            Reporter: Yitong Zhou
>            Priority: Minor
>
> I can create the error when doing something like:
> {code}
> LIBJARS=/jars.../
> bin/spark-submit \
>  --driver-java-options "-Djob.url=http://www.foo.bar?query=a&b" \
>  --class com.example.Class \
>  --master yarn-cluster \
>  --num-executors 3 \
>  --executor-cores 1 \
>  --queue default \
>  --driver-memory 1g \
>  --executor-memory 1g \
>  --jars $LIBJARS \
>  ../a.jar \
>  -inputPath /user/yizhou/CED-scoring/input \
>  -outputPath /user/yizhou
> {code}
> Notice that if I remove the "&" from the "--driver-java-options" value, the submit succeeds. A typical error message looks like this:
> {code}
> org.apache.hadoop.util.Shell$ExitCodeException: Usage: java [-options] class [args...]
>            (to execute a class)
>    or  java [-options] -jar jarfile [args...]
>            (to execute a jar file)
> where options include:
>     -d32	  use a 32-bit data model if available
>     -d64	  use a 64-bit data model if available
>     -server	  to select the "server" VM
>                   The default VM is server,
>                   because you are running on a server-class machine.
>     -cp <class search path of directories and zip/jar files>
>     -classpath <class search path of directories and zip/jar files>
>                   A : separated list of directories, JAR archives,
>                   and ZIP archives to search for class files.
>     -D<name>=<value>
>                   set a system property
>     -verbose:[class|gc|jni]
>                   enable verbose output
>     -version      print product version and exit
>     -version:<value>
>                   require the specified version to run
>     -showversion  print product version and continue
>     -jre-restrict-search | -no-jre-restrict-search
>                   include/exclude user private JREs in the version search
>     -? -help      print this help message
>     -X            print help on non-standard options
>     -ea[:<packagename>...|:<classname>]
>     -enableassertions[:<packagename>...|:<classname>]
>                   enable assertions with specified granularity
>     -da[:<packagename>...|:<classname>]
>     -disableassertions[:<packagename>...|:<classname>]
>                   disable assertions with specified granularity
>     -esa | -enablesystemassertions
>                   enable system assertions
>     -dsa | -disablesystemassertions
>                   disable system assertions
>     -agentlib:<libname>[=<options>]
>                   load native agent library <libname>, e.g. -agentlib:hprof
>                   see also, -agentlib:jdwp=help and -agentlib:hprof=help
>     -agentpath:<pathname>[=<options>]
>                   load native agent library by full pathname
>     -javaagent:<jarpath>[=<options>]
>                   load Java programming language agent, see java.lang.instrument
>     -splash:<imagepath>
>                   show splash screen with specified image
> See http://www.oracle.com/technetwork/java/javase/documentation/index.html for more details.
> 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:505)
> 	at org.apache.hadoop.util.Shell.run(Shell.java:418)
> 	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
> 	at org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor.launchContainer(LinuxContainerExecutor.java:279)
> 	at org.apache.hadoop.yarn.server.nodemanager.PepperdataContainerExecutor.launchContainer(PepperdataContainerExecutor.java:130)
> 	at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:283)
> 	at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:79)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> {code}
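The usage dump above is ordinary shell parsing at work: YARN writes the driver JVM invocation into a generated launch script, and an unquoted "&" in that script is a control operator, so the java command line is cut off at the ampersand and the remainder is run as a separate command. A minimal sketch of the effect (hypothetical demo, not Spark code; the `-Djob.url` value is the one from the report):

```shell
#!/bin/sh
# Demonstrates why an unescaped & breaks a generated command line.
url='http://www.foo.bar?query=a&b'

# Unquoted in the child shell: & is a control operator, so the echo
# command ends at "query=a" and "b" is run as a (nonexistent) command.
broken=$(sh -c "echo -Djob.url=$url" 2>/dev/null)

# Quoted in the child shell (equivalently, & escaped as \&): the whole
# option survives as one argument.
intact=$(sh -c "echo \"-Djob.url=$url\"")

echo "broken: $broken"   # loses everything from & onward
echo "intact: $intact"   # full -Djob.url=... value
```

Until SparkSubmit escapes such characters itself, the workaround is to make sure the value is still quoted (or the "&" backslash-escaped) by the time it lands in the generated container launch script.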



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org