Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:00:26 UTC

[jira] [Updated] (SPARK-22633) spark-submit.cmd cannot handle long arguments

     [ https://issues.apache.org/jira/browse/SPARK-22633?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-22633:
---------------------------------
    Labels: bulk-closed windows  (was: windows)

> spark-submit.cmd cannot handle long arguments
> ---------------------------------------------
>
>                 Key: SPARK-22633
>                 URL: https://issues.apache.org/jira/browse/SPARK-22633
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 2.1.1
>         Environment: Windows 7 x64
>            Reporter: Olivier Sannier
>            Priority: Major
>              Labels: bulk-closed, windows
>
> Hello,
> Under Windows, one uses spark-submit.cmd with the parameters required to submit a program to Spark. That script has the following implementation:
> {{cmd /V /E /C "%~dp0spark-submit2.cmd" %*}}
> This spawns a second shell to ensure changes to the environment are local to the script and do not leak to the caller.
> But this has a major drawback: it hits the 2048-character limit on a cmd.exe command line:
> https://support.microsoft.com/en-us/help/830473/command-prompt-cmd--exe-command-line-string-limitation
> One workaround is to call {{spark-submit2.cmd}} directly, but that means using a Windows-specific command.
> The other solution is to remove the call to {{cmd}} and instead call {{setlocal}} before invoking {{spark-submit2.cmd}}, leading to this code:
> {{setlocal}}
> {{"%~dp0spark-submit2.cmd" %*}}
> Using this here solved the issue altogether, but I'm not sure it can be applied to older Windows versions.
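> For completeness, here is a minimal sketch of what the whole wrapper could look like with this change; the {{rem}} comments are illustrative only and not part of the current script:
> {{rem spark-submit.cmd - forwards all arguments to spark-submit2.cmd}}
> {{rem setlocal confines environment changes made by spark-submit2.cmd to this script}}
> {{rem (an implicit endlocal runs when the script exits), so the extra cmd.exe process}}
> {{rem and its command-line length limit are no longer needed.}}
> {{setlocal}}
> {{"%~dp0spark-submit2.cmd" %*}}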



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org