Posted to user@spark.apache.org by Soumya Simanta <so...@gmail.com> on 2014/09/19 22:37:48 UTC

Problem with giving memory to executors on YARN

I'm launching a Spark shell with the following parameters:

./spark-shell --master yarn-client --executor-memory 32g --driver-memory 4g
--executor-cores 32 --num-executors 8

but when I look at the Spark UI, it shows only 209.3 GB of total memory.


Executors (10)

   - Memory: 55.9 GB Used (209.3 GB Total)

This is a 10-node YARN cluster where each node has 48 GB of memory.
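(For scale, the flags above ask for 8 executors * 32 GB = 256 GB of executor heap, so the question is why the UI reports only 209.3 GB of that.)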

Any idea what I'm missing here?

Thanks
-Soumya

Re: Problem with giving memory to executors on YARN

Posted by Sandy Ryza <sa...@cloudera.com>.
I'm actually surprised your memory is that high. Spark only allocates
spark.storage.memoryFraction of each executor's heap for storing RDDs. This
defaults to 0.6, so 32 GB * 0.6 * 10 executors should be a total of 192 GB.
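
For reference, that fraction is just a configurable property. A minimal sketch of setting it explicitly at launch, assuming a Spark 1.x build whose spark-shell accepts --conf (the invocation below is illustrative, not taken from this thread):

./spark-shell --master yarn-client --executor-memory 32g --driver-memory 4g \
  --executor-cores 32 --num-executors 8 \
  --conf spark.storage.memoryFraction=0.6   # the default; raise it to cache more RDD data, lower it to leave more room for execution

With that default, the back-of-the-envelope figure is 32 GB * 0.6 * 10 = 192 GB, which is why the 209.3 GB you see looks high rather than low.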

-Sandy


Re: Problem with giving memory to executors on YARN

Posted by Soumya Simanta <so...@gmail.com>.
There are 128 cores on each box. Yes, there are other applications running on
the cluster. YARN is assigning two containers to my application. I'll
investigate this a little more. PS: I'm new to YARN.
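
One thing I plan to look at while I'm at it (just a guess on my part, since I'm new to YARN) is whether a 32 GB executor plus YARN's overhead even fits under the per-container and per-node limits. A rough way to read them off a node, with the config path being illustrative for this cluster:

# largest single container the scheduler will grant
grep -A1 'yarn.scheduler.maximum-allocation-mb' /etc/hadoop/conf/yarn-site.xml

# total memory each NodeManager offers to containers
grep -A1 'yarn.nodemanager.resource.memory-mb' /etc/hadoop/conf/yarn-site.xml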




Re: Problem with giving memory to executors on YARN

Posted by Vipul Pandey <vi...@gmail.com>.
How many cores do you have in your boxes?
It looks like you are assigning 32 cores "per" executor - is that what you want? Are there other applications running on the cluster? You might want to check the YARN UI to see how many containers are getting allocated to your application.
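
If the web UI is not handy, here is a rough sketch of checking the same thing from the YARN command line (the output format varies a bit across Hadoop versions, and <application-id> stands for whatever ID the ResourceManager assigned to your Spark shell):

# list running applications and their IDs
yarn application -list

# print the report for one application (state, queue, tracking URL, and so on)
yarn application -status <application-id>

The application's page in the ResourceManager web UI also shows how many containers it currently holds and how much memory each one was granted.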

