Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2014/12/30 10:57:13 UTC

[jira] [Commented] (SPARK-5005) Failed to start spark-shell when using yarn-client mode with Spark 1.2.0

    [ https://issues.apache.org/jira/browse/SPARK-5005?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14260955#comment-14260955 ] 

Sean Owen commented on SPARK-5005:
----------------------------------

You should use {{--master yarn-client}} not {{MASTER=yarn-client}}.
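
For example, in 1.2.0 the shell can be started like this (a sketch; the resource values mirror those in the error output below and are only illustrative):

{code}
# Pass yarn-client through --master rather than the MASTER environment variable
bin/spark-shell --master yarn-client --num-executors 2 --executor-cores 8 --executor-memory 1024m
{code}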

> Failed to start spark-shell when using yarn-client mode with Spark 1.2.0
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-5005
>                 URL: https://issues.apache.org/jira/browse/SPARK-5005
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, Spark Shell, YARN
>    Affects Versions: 1.2.0
>         Environment: Spark 1.2.0
> Hadoop 2.2.0
>            Reporter: yangping wu
>            Priority: Minor
>   Original Estimate: 8h
>  Remaining Estimate: 8h
>
> I am using Spark 1.2.0, but when I start spark-shell in yarn-client mode ({code}MASTER=yarn-client bin/spark-shell{code}), it fails with the error message:
> {code}
> Unknown/unsupported param List(--executor-memory, 1024m, --executor-cores, 8, --num-executors, 2)
> Usage: org.apache.spark.deploy.yarn.ApplicationMaster [options] 
> Options:
>   --jar JAR_PATH       Path to your application's JAR file (required)
>   --class CLASS_NAME   Name of your application's main class (required)
>   --args ARGS          Arguments to be passed to your application's main class.
>                        Mutliple invocations are possible, each will be passed in order.
>   --num-executors NUM    Number of executors to start (Default: 2)
>   --executor-cores NUM   Number of cores for the executors (Default: 1)
>   --executor-memory MEM  Memory per executor (e.g. 1000M, 2G) (Default: 1G)
> {code}
> But when I use Spark 1.1.0 and start spark-shell the same way ({code}MASTER=yarn-client bin/spark-shell{code}), it works.



