Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2016/11/08 02:11:58 UTC

[jira] [Updated] (SPARK-18340) Inconsistent error messages in launching scripts and hanging in sparkr script for wrong options

     [ https://issues.apache.org/jira/browse/SPARK-18340?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-18340:
---------------------------------
    Description: 
It seems there are some problems with handling wrong options, as shown below:

*{{spark-submit}} script* - this one looks fine

{code}
spark-submit --aabbcc
Error: Unrecognized option: --aabbcc

Usage: spark-submit [options] <app jar | python file> [app arguments]
Usage: spark-submit --kill [submission ID] --master [spark://...]
Usage: spark-submit --status [submission ID] --master [spark://...]
Usage: spark-submit run-example [options] example-class [example args]

Options:
  --master MASTER_URL         spark://host:port, mesos://host:port, yarn, or local.
  --deploy-mode DEPLOY_MODE   Whether to launch the driver program locally ("client") or
                              on one of the worker machines inside the cluster ("cluster")
                              (Default: client).
  --class CLASS_NAME          Your application's main class (for Java / Scala apps).
  --name NAME                 A name of your application.
...
{code}
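
For reference, this friendly behaviour could be shared by the other scripts. Below is a minimal, hypothetical sketch of the pattern (a whitelist check against known options); it is not Spark's actual {{SparkSubmitOptionParser}}, just an illustration of rejecting an unknown option and printing usage before anything else runs:

{code}
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch: reject unknown "--" options up front and print usage,
// mirroring the spark-submit behaviour shown above. Not Spark's real parser.
public class OptionCheck {
  private static final List<String> KNOWN =
      Arrays.asList("--master", "--deploy-mode", "--class", "--name");

  public static void main(String[] args) {
    for (String arg : args) {
      if (arg.startsWith("--") && !KNOWN.contains(arg)) {
        System.err.println("Error: Unrecognized option: " + arg);
        System.err.println();
        System.err.println("Usage: spark-submit [options] <app jar | python file> [app arguments]");
        System.exit(1);
      }
    }
    // ... continue building the command for known options ...
  }
}
{code}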


*{{spark-sql}} script* - this one looks fine

{code}
spark-sql --aabbcc
Unrecognized option: --aabbcc
usage: hive
 -d,--define <key=value>          Variable subsitution to apply to hive
                                  commands. e.g. -d A=B or --define A=B
    --database <databasename>     Specify the database to use
...
{code}

*{{sparkr}} script* - this one might be more serious because users may mistakenly pass wrong options (for example, via typos), and the error message does not indicate that the options are wrong; the shell simply hangs and then times out.

{code}
sparkr --aabbcc

...

Error in sparkR.sparkContext(master, appName, sparkHome, sparkConfigMap,  :
  JVM is not ready after 10 seconds
>
{code}
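
The likely cause is that the {{sparkr}} script launches the JVM backend through {{spark-submit}} and then waits for it to come up; when {{spark-submit}} dies immediately on the unknown option, the R side only observes a timeout. A rough, hypothetical sketch of the failure mode (and of failing fast on the child's exit status instead):

{code}
import java.util.concurrent.TimeUnit;

// Illustrative sketch, not SparkR's actual launch path: the parent waits for
// the backend JVM, but the backend already died on the bad option, so only
// "JVM is not ready after 10 seconds" surfaces. Checking the child's exit
// status first would expose the real error.
public class BackendWait {
  public static void main(String[] args) throws Exception {
    Process backend = new ProcessBuilder("spark-submit", "--aabbcc")
        .inheritIO()
        .start();
    // Hypothetical improvement: if the JVM exited within the timeout window
    // with a non-zero status, report that instead of a generic timeout.
    if (backend.waitFor(10, TimeUnit.SECONDS) && backend.exitValue() != 0) {
      System.err.println("Backend JVM failed to start; see the error above.");
      System.exit(backend.exitValue());
    }
    // ... otherwise connect to the running backend ...
  }
}
{code}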

*{{pyspark}} script* - we could make the error message consistent with the others

{code}
pyspark --aabbcc
Exception in thread "main" java.lang.IllegalArgumentException: pyspark does not support any application options.
	at org.apache.spark.launcher.CommandBuilderUtils.checkArgument(CommandBuilderUtils.java:241)
	at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildPySparkShellCommand(SparkSubmitCommandBuilder.java:290)
	at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildCommand(SparkSubmitCommandBuilder.java:147)
	at org.apache.spark.launcher.Main.main(Main.java:86)
{code}
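
Judging by the stack trace, the message comes from a {{checkArgument}}-style precondition in the launcher. A hedged sketch of that pattern follows (the exact formatting is assumed, not copied from Spark's source); catching this case and printing an {{Error: ...}} line plus usage, as {{spark-submit}} does, would make the output consistent:

{code}
// Sketch of a checkArgument-style precondition helper, as the stack trace
// above suggests; details are assumed, not copied from Spark's source.
public class CheckArgumentSketch {
  static void checkArgument(boolean check, String msg, Object... args) {
    if (!check) {
      // Throwing here surfaces as the raw stack trace shown above; printing
      // "Error: ..." plus usage and exiting would match spark-submit.
      throw new IllegalArgumentException(String.format(msg, args));
    }
  }

  public static void main(String[] args) {
    checkArgument(args.length == 0,
        "pyspark does not support any application options.");
  }
}
{code}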

*{{spark-shell}}* - it seems the error message is not as friendly as the ones from {{spark-submit}} or {{spark-sql}}.

{code}
spark-shell --aabbcc
bad option: '--aabbcc'
{code}
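
One option would be a small shared helper in the launcher so that all of these scripts report unknown options the same way {{spark-submit}} does; a hypothetical sketch (the names here are made up):

{code}
// Hypothetical shared helper so spark-shell, pyspark and sparkr could all
// report unknown options consistently with spark-submit.
public class LauncherErrors {
  static void unrecognizedOption(String script, String opt) {
    System.err.println("Error: Unrecognized option: " + opt);
    System.err.println();
    System.err.println("Run '" + script + " --help' for usage.");
    System.exit(1);
  }

  public static void main(String[] args) {
    unrecognizedOption("spark-shell", "--aabbcc");
  }
}
{code}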


> Inconsistent error messages in launching scripts and hanging in sparkr script for wrong options
> -----------------------------------------------------------------------------------------------
>
>                 Key: SPARK-18340
>                 URL: https://issues.apache.org/jira/browse/SPARK-18340
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, Spark Submit
>            Reporter: Hyukjin Kwon
>            Priority: Minor
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
