Posted to user@spark.apache.org by Mohammed Guller <mo...@glassbeam.com> on 2014/12/06 08:25:32 UTC

Fair scheduling across applications in stand-alone mode

Hi -

I understand that one can use "spark.deploy.defaultCores" and "spark.cores.max" to assign a fixed number of worker cores to each app. However, instead of statically assigning the cores, I would like Spark to assign them dynamically across multiple apps. For example, when only a single app is running, that app should get all of the cluster's resources, but when other apps are submitted, any free resources should be assigned to the new apps.
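For reference, this is roughly the kind of static assignment I mean, sketched with SparkConf; the master URL and the core count of 4 below are just placeholder values:

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch of statically capping one application's share of the cluster.
    // "spark://master-host:7077" and the value "4" are placeholders only.
    val conf = new SparkConf()
      .setAppName("statically-capped-app")
      .setMaster("spark://master-host:7077")
      .set("spark.cores.max", "4")   // this app never gets more than 4 cores

    val sc = new SparkContext(conf)

    // spark.deploy.defaultCores, by contrast, is set on the standalone master
    // and applies as the default cap for apps that do not set spark.cores.max.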

Is there any configuration setting to achieve this in stand-alone mode?

Mohammed