Posted to user@spark.apache.org by Nan Zhu <zh...@gmail.com> on 2013/10/24 05:31:32 UTC

dynamically resizing Spark cluster

Hi, all  

I’m deploying a Spark cluster on EC2 that is shared by multiple users.

I would like to shrink the cluster when most of the users are idle, to save cost.

I’m running one worker on each instance. If I directly shut down some instances while tasks are still running on them,

what will happen in Spark? Will it recover those tasks with something like speculative execution, or will the job unfortunately fail?
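
For example, would settings like these help at all when a whole worker disappears? This is only a rough sketch of what I have in mind: spark.speculation and spark.task.maxFailures are documented Spark properties, but the master URL and application name below are just placeholders, and the values are illustrative, not recommendations.

    import org.apache.spark.SparkContext

    // Sketch only: speculatively re-launch straggling tasks, and allow a
    // few retries of a lost task before the whole job is failed.
    System.setProperty("spark.speculation", "true")
    System.setProperty("spark.task.maxFailures", "4")

    // Placeholder master URL and application name.
    val sc = new SparkContext("spark://master:7077", "shared-cluster-app")

Or is task retry on the remaining workers handled by the scheduler regardless of these settings?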

Best,

--  
Nan Zhu
School of Computer Science,
McGill University