Posted to user@spark.apache.org by hequn cheng <ch...@gmail.com> on 2014/09/16 07:08:42 UTC

How to set executor num on spark on yarn

Hi, I want to set the number of executors to 16, but strangely --executor-cores
seems to affect how many executors I actually get on Spark on YARN. I don't
understand why, or how to set the executor number reliably.
=============================================
./bin/spark-submit --class com.hequn.spark.SparkJoins \
    --master yarn-cluster \
    --num-executors 16 \
    --driver-memory 2g \
    --executor-memory 10g \
    --executor-cores 4 \
    /home/sparkjoins-1.0-SNAPSHOT.jar

The UI shows there are 7 executors
=============================================
./bin/spark-submit --class com.hequn.spark.SparkJoins \
    --master yarn-cluster \
    --num-executors 16 \
    --driver-memory 2g \
    --executor-memory 10g \
    --executor-cores 2 \
    /home/sparkjoins-1.0-SNAPSHOT.jar

The UI shows there are 9 executors
=============================================
./bin/spark-submit --class com.hequn.spark.SparkJoins \
    --master yarn-cluster \
    --num-executors 16 \
    --driver-memory 2g \
    --executor-memory 10g \
    --executor-cores 1 \
    /home/sparkjoins-1.0-SNAPSHOT.jar

The UI shows there are 9 executors
==============================================
The cluster contains 16 nodes, each with 64G RAM.

Re: How to set executor num on spark on yarn

Posted by Sean Owen <so...@cloudera.com>.
How many cores do your machines have? --executor-cores should be the
number of cores each executor uses. Fewer cores means more executors
in general. From your data, it sounds like, for example, there are 7
nodes with 4+ cores available to YARN, and 2 more nodes with 2-3 cores
available. Hence when you ask for fewer cores per executor, more can
fit.
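
As a rough sketch of the per-node arithmetic, with made-up numbers (substitute
whatever your NodeManagers actually advertise to YARN): each executor needs
--executor-cores vcores and --executor-memory plus a small overhead, the number
that fit on a node is the smaller of the two limits, and the cluster-wide total
is further capped by --num-executors.

    # Hypothetical per-node resources advertised to YARN (adjust to your cluster)
    node_vcores=8            # yarn.nodemanager.resource.cpu-vcores
    node_mem_mb=57344        # yarn.nodemanager.resource.memory-mb (e.g. 56g)
    # Per-executor request from spark-submit
    executor_cores=4         # --executor-cores
    executor_mem_mb=10240    # --executor-memory 10g
    overhead_mb=384          # spark.yarn.executor.memoryOverhead (assumed default)

    by_cores=$(( node_vcores / executor_cores ))
    by_mem=$(( node_mem_mb / (executor_mem_mb + overhead_mb) ))
    echo "executors per node: $(( by_cores < by_mem ? by_cores : by_mem ))"

With these assumed numbers a node fits 2 executors by cores but 5 by memory,
so cores are the binding constraint, which matches the pattern you are seeing.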

You may need to increase the number of cores you are letting YARN
manage if this doesn't match your expectation. For example, I'm
guessing your machines have more than 4 cores in reality.
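
If you do need to raise what YARN manages, the per-node limits live in
yarn-site.xml on each NodeManager (example values only; pick ones that match
your hardware, then restart the NodeManagers):

    <!-- yarn-site.xml on each NodeManager; example values, adjust to your hardware -->
    <property>
      <name>yarn.nodemanager.resource.cpu-vcores</name>
      <value>16</value>
    </property>
    <property>
      <name>yarn.nodemanager.resource.memory-mb</name>
      <value>57344</value> <!-- e.g. 56g of the 64g, leaving headroom for the OS -->
    </property>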


---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org