Posted to user@spark.apache.org by Nirav Patel <np...@xactlycorp.com> on 2016/06/07 18:13:27 UTC

Spark dynamic allocation - efficiently request new resources

Hi,

Does the current or upcoming (2.0) Spark dynamic allocation have the
capability to request a container with varying resource requirements based
on various factors? A few factors I can think of are the stage and the data
it is processing: a new executor could ask for either more CPUs or more
memory, i.e. it could have a different number of CPU cores or a different
amount of memory available for its tasks.
That way Spark could handle data skew with a heavier executor, by assigning
more memory or CPUs to new executors.
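
For context, as far as I understand, executor size today is fixed once per
application, and dynamic allocation only varies how many executors of that
one shape are running. A rough sketch of what that looks like (assuming YARN
with the external shuffle service; the values are just placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    // Dynamic allocation scales the *number* of executors up and down,
    // but every executor it requests has the same memory/core shape.
    val conf = new SparkConf()
      .setAppName("dynamic-allocation-example")
      .set("spark.dynamicAllocation.enabled", "true")
      .set("spark.shuffle.service.enabled", "true")    // required for dynamic allocation on YARN
      .set("spark.dynamicAllocation.minExecutors", "2")
      .set("spark.dynamicAllocation.maxExecutors", "50")
      .set("spark.executor.memory", "4g")              // fixed for the whole application
      .set("spark.executor.cores", "2")                // fixed for the whole application

    val sc = new SparkContext(conf)

What I'm asking about is a way to vary that per-request shape, e.g. ask for
a 16g executor for a skewed stage while keeping 4g executors elsewhere.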

Thanks
Nirav
