Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2019/07/16 16:42:12 UTC

[jira] [Updated] (SPARK-26957) Add config properties to configure the default scheduler pool priorities

     [ https://issues.apache.org/jira/browse/SPARK-26957?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-26957:
----------------------------------
    Affects Version/s:     (was: 2.4.0)
                       3.0.0

> Add config properties to configure the default scheduler pool priorities
> ------------------------------------------------------------------------
>
>                 Key: SPARK-26957
>                 URL: https://issues.apache.org/jira/browse/SPARK-26957
>             Project: Spark
>          Issue Type: Improvement
>          Components: Scheduler
>    Affects Versions: 3.0.0
>            Reporter: Dave DeCaprio
>            Priority: Minor
>
> Currently, it is possible to dynamically create new scheduler pools in Spark just by setting the {{spark.scheduler.pool}} local property to a new value.
> We use this capability to create separate pools for different projects that run jobs in the same long-lived driver application. Each project gets its own pool, and within the pool jobs are executed in a FIFO manner.
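> A minimal sketch of how jobs can be routed to a per-project pool (the pool name {{project_a}} is illustrative; {{sc}} is the long-lived {{SparkContext}}):
> {code:scala}
> // Setting the spark.scheduler.pool local property on the submitting thread
> // routes subsequent jobs to that pool, creating it on the fly if it does
> // not already exist.
> sc.setLocalProperty("spark.scheduler.pool", "project_a")
>
> // Jobs submitted from this thread now run in the "project_a" pool.
> sc.parallelize(1 to 1000).count()
>
> // Clearing the property sends later jobs back to the default pool.
> sc.setLocalProperty("spark.scheduler.pool", null)
> {code}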
> This setup works well, except that we also have a low-priority pool for background tasks. We would prefer all of the dynamic pools to have a higher priority than this background pool.
> For a fixed set of projects we can accomplish this by hardcoding the project pool names in a spark_pools.xml config file and setting their priority to 100, as sketched below.
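> A sketch of such an allocation file (pointed to via {{spark.scheduler.allocation.file}}), assuming the priority is expressed through the pool {{weight}}; the pool names are illustrative:
> {code:xml}
> <?xml version="1.0"?>
> <allocations>
>   <!-- Statically declared project pool with a high weight -->
>   <pool name="project_a">
>     <schedulingMode>FIFO</schedulingMode>
>     <weight>100</weight>
>     <minShare>0</minShare>
>   </pool>
>   <!-- Background pool kept at the default weight -->
>   <pool name="background">
>     <schedulingMode>FIFO</schedulingMode>
>     <weight>1</weight>
>     <minShare>0</minShare>
>   </pool>
> </allocations>
> {code}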
> Unfortunately, there is no way to set the priority for dynamically created pools; it is hardcoded to 1. It would be nice if there were configuration properties to change this default.



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org