Posted to issues@spark.apache.org by "Andrew Ash (JIRA)" <ji...@apache.org> on 2014/06/09 09:30:02 UTC

[jira] [Commented] (SPARK-1944) Document --verbose in spark-shell -h

    [ https://issues.apache.org/jira/browse/SPARK-1944?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14021726#comment-14021726 ] 

Andrew Ash commented on SPARK-1944:
-----------------------------------

https://github.com/apache/spark/pull/1020

> Document --verbose in spark-shell -h
> ------------------------------------
>
>                 Key: SPARK-1944
>                 URL: https://issues.apache.org/jira/browse/SPARK-1944
>             Project: Spark
>          Issue Type: Documentation
>          Components: Spark Core
>    Affects Versions: 1.0.0
>            Reporter: Andrew Ash
>            Assignee: Andrew Ash
>            Priority: Minor
>
> The help output below for spark-submit should mention the {{--verbose}} option
> {noformat}
> aash@aash-mbp ~/git/spark$ ./bin/spark-submit -h
> Usage: spark-submit [options] <app jar> [app options]
> Options:
>   --master MASTER_URL         spark://host:port, mesos://host:port, yarn, or local.
>   --deploy-mode DEPLOY_MODE   Mode to deploy the app in, either 'client' or 'cluster'.
>   --class CLASS_NAME          Name of your app's main class (required for Java apps).
>   --arg ARG                   Argument to be passed to your application's main class. This
>                               option can be specified multiple times for multiple args.
>   --name NAME                 The name of your application (Default: 'Spark').
>   --jars JARS                 A comma-separated list of local jars to include on the
>                               driver classpath and that SparkContext.addJar will work
>                               with. Doesn't work on standalone with 'cluster' deploy mode.
>   --files FILES               Comma-separated list of files to be placed in the working dir
>                               of each executor.
>   --properties-file FILE      Path to a file from which to load extra properties. If not
>                               specified, this will look for conf/spark-defaults.conf.
>   --driver-memory MEM         Memory for driver (e.g. 1000M, 2G) (Default: 512M).
>   --driver-java-options       Extra Java options to pass to the driver
>   --driver-library-path       Extra library path entries to pass to the driver
>   --driver-class-path         Extra class path entries to pass to the driver. Note that
>                               jars added with --jars are automatically included in the
>                               classpath.
>   --executor-memory MEM       Memory per executor (e.g. 1000M, 2G) (Default: 1G).
>  Spark standalone with cluster deploy mode only:
>   --driver-cores NUM          Cores for driver (Default: 1).
>   --supervise                 If given, restarts the driver on failure.
>  Spark standalone and Mesos only:
>   --total-executor-cores NUM  Total cores for all executors.
>  YARN-only:
>   --executor-cores NUM        Number of cores per executor (Default: 1).
>   --queue QUEUE_NAME          The YARN queue to submit to (Default: 'default').
>   --num-executors NUM         Number of executors to launch (Default: 2).
>   --archives ARCHIVES         Comma-separated list of archives to be extracted into the
>                               working dir of each executor.
> aash@aash-mbp ~/git/spark$
> {noformat}
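
For context, a hypothetical invocation combining {{--verbose}} with some of the flags documented above might look like the following (the class name, app name, and jar path are illustrative, not taken from the issue):

{noformat}
$ ./bin/spark-submit \
    --verbose \
    --master local \
    --class com.example.MyApp \
    --name "Example" \
    --driver-memory 1G \
    path/to/example-app.jar arg1
{noformat}

With {{--verbose}}, spark-submit prints the parsed arguments and loaded properties before launching the application, which is the behavior the proposed help-text change would surface.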



--
This message was sent by Atlassian JIRA
(v6.2#6252)