Posted to issues@spark.apache.org by "Dave DeCaprio (JIRA)" <ji...@apache.org> on 2019/02/21 17:33:00 UTC
[jira] [Created] (SPARK-26957) Add config properties to configure the default scheduler pool priorities
Dave DeCaprio created SPARK-26957:
-------------------------------------
Summary: Add config properties to configure the default scheduler pool priorities
Key: SPARK-26957
URL: https://issues.apache.org/jira/browse/SPARK-26957
Project: Spark
Issue Type: Improvement
Components: Scheduler
Affects Versions: 2.4.0
Reporter: Dave DeCaprio
Currently, it is possible to dynamically create new scheduler pools in Spark just by setting the {{spark.scheduler.pool}} local property to a new value.
We use this capability to create separate pools for different projects that run jobs in the same long-lived driver application. Each project gets its own pool, and within the pool jobs are executed in a FIFO manner.
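For context, a minimal sketch of how such per-project routing can be done (the pool names and the {{runInPool}} helper are illustrative, not part of the proposal; {{setLocalProperty}} is the standard Spark API for this):

```scala
import org.apache.spark.SparkContext

// Run inside the long-lived driver application; `sc` is the shared SparkContext.
// Setting this thread-local property before submitting a job routes it to the
// named pool; under FAIR scheduling the pool is created on the fly if it does
// not exist in the allocation file.
def runInPool[T](sc: SparkContext, pool: String)(job: => T): T = {
  sc.setLocalProperty("spark.scheduler.pool", pool)
  try job
  finally sc.setLocalProperty("spark.scheduler.pool", null) // revert to default pool
}

// Hypothetical usage: each project's jobs go through its own FIFO pool.
runInPool(sc, "projectA") {
  sc.parallelize(1 to 1000).map(_ * 2).count()
}
```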
This setup works well, except that we also have a low-priority pool for background tasks. We would prefer for all of the dynamic pools to have a higher priority than this background pool.
We can accomplish this by hardcoding the project pool names in a {{spark_pools.xml}} allocation file (referenced via {{spark.scheduler.allocation.file}}) and setting their priority to 100.
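For reference, the workaround looks roughly like this (the pool names and the weight of 100 are our own choices; in Spark's fair scheduler the {{weight}} element is what acts as a pool's priority):

```xml
<?xml version="1.0"?>
<!-- spark_pools.xml: every project pool must be listed here by hand. -->
<allocations>
  <pool name="projectA">
    <schedulingMode>FIFO</schedulingMode>
    <weight>100</weight>   <!-- higher weight = higher priority -->
    <minShare>0</minShare>
  </pool>
  <pool name="background">
    <schedulingMode>FIFO</schedulingMode>
    <weight>1</weight>
    <minShare>0</minShare>
  </pool>
</allocations>
```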
Unfortunately, there is no way to set the priority for dynamically created pools; they are all hardcoded to a weight of 1. It would be nice if there were configuration properties to change this default.