Posted to user@spark.apache.org by nitinkak001 <ni...@gmail.com> on 2015/03/23 16:49:29 UTC

Is yarn-standalone mode deprecated?

Is yarn-standalone mode deprecated in Spark now? I am asking because, while
I can find it in the 0.9.0 documentation
(https://spark.apache.org/docs/0.9.0/running-on-yarn.html), I am not able to
find it in the 1.2.0 documentation.

I am using this mode to run Spark jobs from Oozie as a Java action. Removing
this mode would prevent me from doing that. Are there any other ways of
running a Spark job from Oozie, other than a shell action?
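
For reference, the only alternative I can think of is to wrap the submit
command in a script and call it from an Oozie shell action, roughly like the
sketch below (the class name, jar path, and sizing are placeholders, not my
actual job):

#!/bin/bash
# Rough sketch of a wrapper script an Oozie shell action could call.
# Assumes spark-submit is on the PATH of the node where the action runs;
# every value below is a placeholder.
exec spark-submit \
  --master yarn-cluster \
  --class com.example.MyApp \
  --num-executors 3 \
  --executor-memory 2g \
  hdfs:///user/oozie/apps/my-app/lib/my-app.jar "$@"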



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Is-yarn-standalone-mode-deprecated-tp22188.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Is yarn-standalone mode deprecated?

Posted by Sandy Ryza <sa...@cloudera.com>.
I checked, and apparently it hasn't been released yet. It will be available
in the upcoming CDH 5.4 release.

-Sandy

On Mon, Mar 23, 2015 at 1:32 PM, Nitin kak <ni...@gmail.com> wrote:

> I know there was an effort for this. Do you know which version of the
> Cloudera distribution we can find it in?
>
> On Mon, Mar 23, 2015 at 1:13 PM, Sandy Ryza <sa...@cloudera.com>
> wrote:
>
>> The former is deprecated.  However, the latter is functionally equivalent
>> to it.  Both launch an app in what is now called "yarn-cluster" mode.
>>
>> Oozie now also has a native Spark action, though I'm not familiar with the
>> specifics.
>>
>> -Sandy
>>
>> On Mon, Mar 23, 2015 at 1:01 PM, Nitin kak <ni...@gmail.com> wrote:
>>
>>> To be more clear, I am talking about
>>>
>>> SPARK_JAR=<SPARK_ASSEMBLY_JAR_FILE> ./bin/spark-class org.apache.spark.deploy.yarn.Client \
>>>   --jar <YOUR_APP_JAR_FILE> \
>>>   --class <APP_MAIN_CLASS> \
>>>   --args <APP_MAIN_ARGUMENTS> \
>>>   --num-workers <NUMBER_OF_WORKER_MACHINES> \
>>>   --master-class <ApplicationMaster_CLASS> \
>>>   --master-memory <MEMORY_FOR_MASTER> \
>>>   --worker-memory <MEMORY_PER_WORKER> \
>>>   --worker-cores <CORES_PER_WORKER> \
>>>   --name <application_name> \
>>>   --queue <queue_name> \
>>>   --addJars <any_local_files_used_in_SparkContext.addJar> \
>>>   --files <files_for_distributed_cache> \
>>>   --archives <archives_for_distributed_cache>
>>>
>>> which I thought was the yarn-standalone mode
>>>
>>> vs
>>>
>>> spark-submit
>>>
>>> ./bin/spark-submit --class org.apache.spark.examples.SparkPi \
>>>     --master yarn-cluster \
>>>     --num-executors 3 \
>>>     --driver-memory 4g \
>>>     --executor-memory 2g \
>>>     --executor-cores 1 \
>>>     --queue thequeue \
>>>     lib/spark-examples*.jar
>>>
>>>
>>> I didn't see an example of ./bin/spark-class in the 1.2.0 documentation,
>>> so I am wondering if that is deprecated.
>>>
>>>
>>>
>>>
>>>
>>> On Mon, Mar 23, 2015 at 12:11 PM, Sandy Ryza <sa...@cloudera.com>
>>> wrote:
>>>
>>>> The mode is not deprecated, but the name "yarn-standalone" is now
>>>> deprecated.  It's now referred to as "yarn-cluster".
>>>>
>>>> -Sandy
>>>>
>>>> On Mon, Mar 23, 2015 at 11:49 AM, nitinkak001 <ni...@gmail.com>
>>>> wrote:
>>>>
>>>>> Is yarn-standalone mode deprecated in Spark now? I am asking because,
>>>>> while I can find it in the 0.9.0 documentation
>>>>> (https://spark.apache.org/docs/0.9.0/running-on-yarn.html), I am not
>>>>> able to find it in the 1.2.0 documentation.
>>>>>
>>>>> I am using this mode to run Spark jobs from Oozie as a Java action.
>>>>> Removing this mode would prevent me from doing that. Are there any
>>>>> other ways of running a Spark job from Oozie, other than a shell action?
>>>>>
>>>>
>>>
>>
>

Re: Is yarn-standalone mode deprecated?

Posted by Sandy Ryza <sa...@cloudera.com>.
The former is deprecated.  However, the latter is functionally equivalent
to it.  Both launch an app in what is now called "yarn-cluster" mode.
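
To make that concrete, the old Client flags map roughly onto spark-submit
options like this (a sketch with placeholder values, not an exhaustive
translation): --num-workers becomes --num-executors, --worker-memory becomes
--executor-memory, --worker-cores becomes --executor-cores, --master-memory
becomes --driver-memory, --addJars becomes --jars, and anything you passed
via --args now just goes after the application jar.

# Old style (deprecated):
SPARK_JAR=<SPARK_ASSEMBLY_JAR_FILE> ./bin/spark-class org.apache.spark.deploy.yarn.Client \
  --jar <YOUR_APP_JAR_FILE> \
  --class <APP_MAIN_CLASS> \
  --num-workers 3 \
  --worker-memory 2g \
  --worker-cores 1 \
  --master-memory 4g

# Roughly equivalent spark-submit invocation in yarn-cluster mode:
./bin/spark-submit \
  --master yarn-cluster \
  --class <APP_MAIN_CLASS> \
  --num-executors 3 \
  --executor-memory 2g \
  --executor-cores 1 \
  --driver-memory 4g \
  <YOUR_APP_JAR_FILE>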

Oozie now also has a native Spark action, though I'm not familiar with the
specifics.

-Sandy

On Mon, Mar 23, 2015 at 1:01 PM, Nitin kak <ni...@gmail.com> wrote:

> To be more clear, I am talking about
>
> SPARK_JAR=<SPARK_ASSEMBLY_JAR_FILE> ./bin/spark-class org.apache.spark.deploy.yarn.Client \
>   --jar <YOUR_APP_JAR_FILE> \
>   --class <APP_MAIN_CLASS> \
>   --args <APP_MAIN_ARGUMENTS> \
>   --num-workers <NUMBER_OF_WORKER_MACHINES> \
>   --master-class <ApplicationMaster_CLASS> \
>   --master-memory <MEMORY_FOR_MASTER> \
>   --worker-memory <MEMORY_PER_WORKER> \
>   --worker-cores <CORES_PER_WORKER> \
>   --name <application_name> \
>   --queue <queue_name> \
>   --addJars <any_local_files_used_in_SparkContext.addJar> \
>   --files <files_for_distributed_cache> \
>   --archives <archives_for_distributed_cache>
>
> which I thought was the yarn-standalone mode
>
> vs
>
> spark-submit
>
> ./bin/spark-submit --class org.apache.spark.examples.SparkPi \
>     --master yarn-cluster \
>     --num-executors 3 \
>     --driver-memory 4g \
>     --executor-memory 2g \
>     --executor-cores 1 \
>     --queue thequeue \
>     lib/spark-examples*.jar
>
>
> I didn't see an example of ./bin/spark-class in the 1.2.0 documentation, so
> I am wondering if that is deprecated.
>
>
>
>
>
> On Mon, Mar 23, 2015 at 12:11 PM, Sandy Ryza <sa...@cloudera.com>
> wrote:
>
>> The mode is not deprecated, but the name "yarn-standalone" is now
>> deprecated.  It's now referred to as "yarn-cluster".
>>
>> -Sandy
>>
>> On Mon, Mar 23, 2015 at 11:49 AM, nitinkak001 <ni...@gmail.com>
>> wrote:
>>
>>> Is yarn-standalone mode deprecated in Spark now? I am asking because,
>>> while I can find it in the 0.9.0 documentation
>>> (https://spark.apache.org/docs/0.9.0/running-on-yarn.html), I am not able
>>> to find it in the 1.2.0 documentation.
>>>
>>> I am using this mode to run Spark jobs from Oozie as a Java action.
>>> Removing this mode would prevent me from doing that. Are there any other
>>> ways of running a Spark job from Oozie, other than a shell action?
>>>
>>
>

Re: Is yarn-standalone mode deprecated?

Posted by Sandy Ryza <sa...@cloudera.com>.
The mode is not deprecated, but the name "yarn-standalone" is now
deprecated.  It's now referred to as "yarn-cluster".
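
In other words, where the 0.9.0 docs had your application use
"yarn-standalone" as the master, a 1.2-style submission looks something like
the minimal sketch below (the class and jar names are placeholders):

./bin/spark-submit \
  --master yarn-cluster \
  --class com.example.MyApp \
  path/to/my-app.jar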

-Sandy

On Mon, Mar 23, 2015 at 11:49 AM, nitinkak001 <ni...@gmail.com> wrote:

> Is yarn-standalone mode deprecated in Spark now? I am asking because, while
> I can find it in the 0.9.0 documentation
> (https://spark.apache.org/docs/0.9.0/running-on-yarn.html), I am not able
> to find it in the 1.2.0 documentation.
>
> I am using this mode to run Spark jobs from Oozie as a Java action.
> Removing this mode would prevent me from doing that. Are there any other
> ways of running a Spark job from Oozie, other than a shell action?
>