Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:04:33 UTC

[jira] [Updated] (SPARK-20016) SparkLauncher submit job failed after setConf with special characters under windows

     [ https://issues.apache.org/jira/browse/SPARK-20016?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-20016:
---------------------------------
    Labels: bulk-closed  (was: )

> SparkLauncher submit job failed after setConf with special characters under windows
> -----------------------------------------------------------------------------------
>
>                 Key: SPARK-20016
>                 URL: https://issues.apache.org/jira/browse/SPARK-20016
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API
>    Affects Versions: 2.0.0
>         Environment: windows 7, 8, 10, 2008, 2008R2, etc.
>            Reporter: Vincent Sun
>            Priority: Major
>              Labels: bulk-closed
>
> I am using the SparkLauncher Java API to submit a job to a remote Spark cluster master. The code looks like the following:
> /*
>  * Launch job
>  */
> public static void launch() throws Exception {
>     SparkLauncher spark = new SparkLauncher();
>     spark.setAppName("sparkdemo")
>          .setAppResource("hdfs://10.250.1.121:9000/application.jar")
>          .setMainClass("test.Application");
>     spark.setMaster("spark://10.250.1.120:6066");
>     spark.setDeployMode("cluster");
>     spark.setConf("spark.executor.cores", "2");
>     spark.setConf("spark.executor.memory", "8G");
>     spark.startApplication(new MyAppListener("sparkdemo"));
> }
> It works fine under Linux/CentOS, but fails on my own desktop, which runs Windows 8. It throws this error:
> [launcher-proc-1] The filename, directory name, or volume label syntax is incorrect.
> The final command I captured is this:
> spark-submit.cmd  --master spark://10.250.1.120:6066 --deploy-mode cluster --name sparkdemo --conf "spark.executor.memory=8G" --conf "spark.executor.cores=2"  --class test.Application hdfs://10.250.1.121:9000/application.jar
> The quotes around spark.executor.memory=8G and spark.executor.cores=2 cause the exception.
> After debugging into the source code, I found that the cause is the quoteForBatchScript method of the CommandBuilderUtils class. On Windows it wraps an argument in quotes whenever the argument contains '=' or certain other special characters. Here is the source code:
> static String quoteForBatchScript(String arg) {
>     // First pass: check whether the argument contains any character
>     // that requires quoting on Windows.
>     boolean needsQuotes = false;
>     for (int i = 0; i < arg.length(); i++) {
>       int c = arg.codePointAt(i);
>       if (Character.isWhitespace(c) || c == '"' || c == '=' || c == ',' || c == ';') {
>         needsQuotes = true;
>         break;
>       }
>     }
>     if (!needsQuotes) {
>       return arg;
>     }
>     // Second pass: wrap the argument in double quotes, doubling any
>     // embedded quote so it survives cmd.exe parsing.
>     StringBuilder quoted = new StringBuilder();
>     quoted.append("\"");
>     for (int i = 0; i < arg.length(); i++) {
>       int cp = arg.codePointAt(i);
>       if (cp == '"') {
>         quoted.append('"');
>       }
>       quoted.appendCodePoint(cp);
>     }
>     // A trailing backslash would escape the closing quote, so double it.
>     if (arg.codePointAt(arg.length() - 1) == '\\') {
>       quoted.append("\\");
>     }
>     quoted.append("\"");
>     return quoted.toString();
>   }
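The quoting behavior described above can be exercised in isolation. Below is a minimal standalone sketch; the class name QuoteDemo is hypothetical, and the method body mirrors the quoteForBatchScript logic quoted from the report (with the no-op switch written as an if). It shows that a plain value passes through untouched, while any value containing '=' comes back wrapped in the double quotes that end up in the final spark-submit.cmd command line:

```java
// Minimal standalone sketch; QuoteDemo is a hypothetical class name.
// The method mirrors the quoteForBatchScript logic from the report above.
public class QuoteDemo {

    static String quoteForBatchScript(String arg) {
        // First pass: decide whether quoting is needed at all.
        boolean needsQuotes = false;
        for (int i = 0; i < arg.length(); i++) {
            int c = arg.codePointAt(i);
            if (Character.isWhitespace(c) || c == '"' || c == '=' || c == ',' || c == ';') {
                needsQuotes = true;
                break;
            }
        }
        if (!needsQuotes) {
            return arg;
        }
        // Second pass: wrap in double quotes, doubling embedded quotes.
        StringBuilder quoted = new StringBuilder();
        quoted.append("\"");
        for (int i = 0; i < arg.length(); i++) {
            int cp = arg.codePointAt(i);
            if (cp == '"') {
                quoted.append('"');
            }
            quoted.appendCodePoint(cp);
        }
        // A trailing backslash would escape the closing quote, so double it.
        if (arg.codePointAt(arg.length() - 1) == '\\') {
            quoted.append("\\");
        }
        quoted.append("\"");
        return quoted.toString();
    }

    public static void main(String[] args) {
        // No special characters: returned unchanged.
        System.out.println(quoteForBatchScript("cluster"));          // prints: cluster
        // Contains '=': returned with surrounding double quotes.
        System.out.println(quoteForBatchScript("spark.executor.memory=8G"));
        // prints: "spark.executor.memory=8G"
    }
}
```

Running this makes the reported command line easy to reproduce: each --conf value contains '=', so each one is quoted, matching the captured spark-submit.cmd invocation above.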



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org