Posted to user@spark.apache.org by Ashok Kumar <as...@yahoo.com.INVALID> on 2016/05/12 23:07:01 UTC

Spark handling spill overs

Hi,
How can one avoid having Spark spill over after filling the node's memory?
Thanks



Re: Spark handling spill overs

Posted by Mich Talebzadeh <mi...@gmail.com>.
Spill-overs are a common issue for in-memory computing systems; after all,
memory is limited. In Spark, where RDDs are immutable, if an RDD is created
with a size greater than half of the node's RAM, then a transformation that
generates the consequent RDD can potentially fill all of the node's memory
and cause a spill-over into swap space.
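
A minimal sketch of one way to keep such a job from pushing the node into
swap (the input path, partition count and app name below are illustrative
assumptions, not from this thread): repartition so that individual partitions
stay small, and persist with a storage level that lets Spark spill serialized
blocks to local disk under its own control instead of relying on OS swap:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

// Hypothetical job: count line lengths over an input that may not fit in RAM.
val conf = new SparkConf().setAppName("spill-example")
val sc = new SparkContext(conf)

val big = sc.textFile("/data/big_input")       // hypothetical input path
  .repartition(400)                            // smaller partitions lower per-task memory pressure
  .persist(StorageLevel.MEMORY_AND_DISK_SER)   // blocks that do not fit are spilled, serialized, to local disk

val counts = big.map(line => (line.length, 1L)).reduceByKey(_ + _)
counts.count()

With MEMORY_AND_DISK_SER, blocks that do not fit in the storage region are
written to the executor's local directories rather than left to the operating
system's swap.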

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 13 May 2016 at 00:38, Takeshi Yamamuro <li...@gmail.com> wrote:

> Hi,
>
> Which version of Spark do you use?
> The recent one cannot handle this kind of spilling, see:
> http://spark.apache.org/docs/latest/tuning.html#memory-management-overview.
>
> // maropu
>
> On Fri, May 13, 2016 at 8:07 AM, Ashok Kumar <ashok34668@yahoo.com.invalid
> > wrote:
>
>> Hi,
>>
>> How can one avoid having Spark spill over after filling the node's memory?
>>
>> Thanks
>>
>>
>>
>>
>
>
> --
> ---
> Takeshi Yamamuro
>

Re: Spark handling spill overs

Posted by Takeshi Yamamuro <li...@gmail.com>.
Hi,

Which version of Spark do you use?
The recent one cannot handle this kind of spilling, see:
http://spark.apache.org/docs/latest/tuning.html#memory-management-overview.
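
The knobs described in that memory-management overview are set on the
SparkConf; a minimal sketch for Spark 1.6+ (the values and app name below are
illustrative assumptions, not recommendations from this thread):

import org.apache.spark.SparkConf

val conf = new SparkConf()
  .setAppName("memory-tuning-example")          // hypothetical app name
  .set("spark.executor.memory", "4g")           // per-executor heap; size it to the node's RAM
  .set("spark.memory.fraction", "0.6")          // share of the heap used for execution + storage
  .set("spark.memory.storageFraction", "0.5")   // part of that region protected from eviction

Execution memory that does not fit within this region spills to the
executor's local disk, so these fractions together with partition sizes are
usually the levers for avoiding OS swap.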

// maropu

On Fri, May 13, 2016 at 8:07 AM, Ashok Kumar <as...@yahoo.com.invalid>
wrote:

> Hi,
>
> How can one avoid having Spark spill over after filling the node's memory?
>
> Thanks
>
>
>
>


-- 
---
Takeshi Yamamuro