Posted to issues@spark.apache.org by "ASF GitHub Bot (JIRA)" <ji...@apache.org> on 2014/03/30 05:27:14 UTC

[jira] [Commented] (SPARK-1186) Enrich the Spark Shell to support additional arguments.

    [ https://issues.apache.org/jira/browse/SPARK-1186?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13954559#comment-13954559 ] 

ASF GitHub Bot commented on SPARK-1186:
---------------------------------------

Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/116


> Enrich the Spark Shell to support additional arguments.
> -------------------------------------------------------
>
>                 Key: SPARK-1186
>                 URL: https://issues.apache.org/jira/browse/SPARK-1186
>             Project: Apache Spark
>          Issue Type: Improvement
>    Affects Versions: 0.9.0
>            Reporter: Bernardo Gomez Palacio
>
> Enrich the Spark Shell functionality to support the following options.
> {code:title=spark-shell.sh|borderStyle=solid}
> Usage: spark-shell [OPTIONS]
> OPTIONS:
>     -h  --help             : Print this help information.
>     -c  --cores            : The maximum number of cores to be used by the Spark Shell.
>     -em --executor-memory  : The memory used by each executor of the Spark Shell; the number
>                              is followed by m for megabytes or g for gigabytes, e.g. "1g".
>     -dm --driver-memory    : The memory used by the Spark Shell; the number is followed
>                              by m for megabytes or g for gigabytes, e.g. "1g".
>     -m  --master           : A full string that describes the Spark Master, defaults to "local",
>                              e.g. "spark://localhost:7077".
>     --log-conf             : Enables logging of the supplied SparkConf as INFO at start of the
>                              Spark Context.
> e.g.
>     spark-shell -m spark://localhost:7077 -c 4 -dm 512m -em 2g
> {code}
> **Note**: the options described above may not be visually aligned due to JIRA's rendering; in the bash CLI they are.
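
For reference, a minimal sketch of how such option parsing could look in a bash wrapper script. The option names come from the description above; everything else (variable names, the final echo, the script title) is an assumption for illustration, not the actual spark-shell implementation or the patch from the closed pull request:

{code:title=example-arg-parsing.sh|borderStyle=solid}
#!/usr/bin/env bash
# Hypothetical sketch only -- not the real spark-shell script.
# Variable names (MASTER, CORES, DRIVER_MEM, EXECUTOR_MEM, LOG_CONF) are assumptions.

MASTER="local"
CORES=""
DRIVER_MEM=""
EXECUTOR_MEM=""
LOG_CONF=0

usage() {
  echo "Usage: spark-shell [OPTIONS]"
  exit "${1:-0}"
}

# Walk the argument list, consuming a value for options that take one.
while [ $# -gt 0 ]; do
  case "$1" in
    -h|--help)              usage 0 ;;
    -c|--cores)             CORES="$2"; shift ;;
    -em|--executor-memory)  EXECUTOR_MEM="$2"; shift ;;
    -dm|--driver-memory)    DRIVER_MEM="$2"; shift ;;
    -m|--master)            MASTER="$2"; shift ;;
    --log-conf)             LOG_CONF=1 ;;
    *)                      echo "Unknown option: $1" >&2; usage 1 ;;
  esac
  shift
done

# e.g. spark-shell -m spark://localhost:7077 -c 4 -dm 512m -em 2g
echo "master=$MASTER cores=$CORES driver-mem=$DRIVER_MEM executor-mem=$EXECUTOR_MEM log-conf=$LOG_CONF"
{code}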



--
This message was sent by Atlassian JIRA
(v6.2#6252)