Posted to user@spark.apache.org by charles li <ch...@gmail.com> on 2016/02/05 06:26:26 UTC

spark.executor.memory: is it used only for caching RDDs, or for both cached RDDs and task execution on the workers?

If I set spark.executor.memory = 2G for each worker [ 10 in total ],

does that mean I can cache 20G of RDDs in memory? If so, how much memory
is left for the code running in each executor process on each worker?
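For reference, a setup like the one described could be expressed at submit time roughly as follows. This is only a sketch: the application file name is a placeholder, and --num-executors applies when running on YARN; in standalone mode the executor count is derived from the cluster resources instead.

```shell
# Hypothetical submit command for the setup described above:
# 10 executors with 2 GB of heap each (spark.executor.memory).
spark-submit \
  --conf spark.executor.memory=2g \
  --num-executors 10 \
  my_app.py
```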

thanks.


--
Also, is there any material about memory management or resource management
in Spark? I want to put Spark into production, but I know little about
resource management in Spark. Thanks again.


-- 
*--------------------------------------*
a spark lover, a quant, a developer and a good man.

http://github.com/litaotao

Re: spark.executor.memory: is it used only for caching RDDs, or for both cached RDDs and task execution on the workers?

Posted by Rishi Mishra <rm...@snappydata.io>.
You would probably like to see
http://spark.apache.org/docs/latest/configuration.html#memory-management.
Other config parameters are also explained there.
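For concreteness, the unified memory model described on that page (Spark 1.6+) can be sketched numerically. The defaults below are the documented ones, but they vary by version (spark.memory.fraction was 0.75 in Spark 1.6 and 0.6 in later releases), so treat this as a rough estimate rather than exact accounting:

```python
# Back-of-the-envelope sketch of Spark's unified memory model (Spark 1.6+).
# Assumed defaults (check your version's configuration docs):
#   reserved memory              = 300 MB
#   spark.memory.fraction        = 0.6
#   spark.memory.storageFraction = 0.5
def unified_memory_mb(executor_memory_mb,
                      memory_fraction=0.6,
                      storage_fraction=0.5,
                      reserved_mb=300):
    usable = executor_memory_mb - reserved_mb
    unified = usable * memory_fraction          # shared by storage AND execution
    storage_floor = unified * storage_fraction  # storage share protected from eviction
    return unified, storage_floor

# spark.executor.memory = 2G per executor:
unified, storage = unified_memory_mb(2048)
print(round(unified, 1), round(storage, 1))
```

So with 2 GB per executor and 10 executors, the cluster-wide memory eligible for caching is closer to ~10 GB than 20 GB, and execution (shuffles, joins, aggregation buffers) borrows from the same unified pool, shrinking the cached portion further under load.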




-- 
Regards,
Rishitesh Mishra,
SnappyData . (http://www.snappydata.io/)

https://in.linkedin.com/in/rishiteshmishra