Posted to issues@spark.apache.org by "Eren Avsarogullari (JIRA)" <ji...@apache.org> on 2016/10/02 11:52:20 UTC
[jira] [Created] (SPARK-17759) SchedulableBuilder should avoid creating duplicate fair scheduler pools
Eren Avsarogullari created SPARK-17759:
------------------------------------------
Summary: SchedulableBuilder should avoid creating duplicate fair scheduler pools
Key: SPARK-17759
URL: https://issues.apache.org/jira/browse/SPARK-17759
Project: Spark
Issue Type: Bug
Components: Scheduler
Affects Versions: 2.1.0
Reporter: Eren Avsarogullari
If the file referenced by _spark.scheduler.allocation.file_ defines pools with duplicate names, all of them are created when _SparkContext_ is initialized, but only one pool per name is ever used; the rest are redundant. _SchedulableBuilder_ should skip duplicate pool definitions instead of creating them.
*Code to Reproduce* :
{code:scala}
val conf = new SparkConf().setAppName("spark-fairscheduler").setMaster("local")
conf.set("spark.scheduler.mode", "FAIR")
conf.set("spark.scheduler.allocation.file", "src/main/resources/fairscheduler-duplicate-pools.xml")
val sc = new SparkContext(conf)
{code}
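One possible fix would be to guard pool registration on the name-keyed map, so a pool whose name is already registered is skipped. The sketch below uses hypothetical simplified types (_SchedulablePool_, _ParentPool_), not Spark's actual _Schedulable_/_Pool_/_SchedulableBuilder_ classes:

{code:scala}
import java.util.concurrent.{ConcurrentHashMap, ConcurrentLinkedQueue}

// Hypothetical simplified stand-ins for Spark's Schedulable/Pool types.
case class SchedulablePool(name: String, minShare: Int, weight: Int)

class ParentPool {
  val schedulableQueue = new ConcurrentLinkedQueue[SchedulablePool]()
  val schedulableNameToSchedulable = new ConcurrentHashMap[String, SchedulablePool]()

  // Proposed guard: ignore a pool whose name is already registered, so the
  // queue and the name-to-schedulable map stay consistent with each other.
  def addSchedulable(pool: SchedulablePool): Unit = {
    if (!schedulableNameToSchedulable.containsKey(pool.name)) {
      schedulableQueue.add(pool)
      schedulableNameToSchedulable.put(pool.name, pool)
    }
  }
}
{code}

With this guard, adding "default" twice leaves exactly one "default" entry in both collections.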
*fairscheduler-duplicate-pools.xml* :
The following sample shows just two duplicate definitions each for default and duplicate_pool1, but the same problem applies to any number of duplicate definitions of default and/or any other pool.
{code:xml}
<allocations>
  <pool name="default">
    <minShare>0</minShare>
    <weight>1</weight>
    <schedulingMode>FAIR</schedulingMode>
  </pool>
  <pool name="default">
    <minShare>0</minShare>
    <weight>1</weight>
    <schedulingMode>FAIR</schedulingMode>
  </pool>
  <pool name="duplicate_pool1">
    <minShare>1</minShare>
    <weight>1</weight>
    <schedulingMode>FAIR</schedulingMode>
  </pool>
  <pool name="duplicate_pool1">
    <minShare>2</minShare>
    <weight>2</weight>
    <schedulingMode>FAIR</schedulingMode>
  </pool>
</allocations>
{code}
*Debug Screenshot* :
After initialization, _Pool.schedulableQueue_ (a _ConcurrentLinkedQueue[Schedulable]_) holds 4 pools: default, default, duplicate_pool1, duplicate_pool1. However, _Pool.schedulableNameToSchedulable_ (a _ConcurrentHashMap[String, Schedulable]_ keyed by pool name) holds only default and duplicate_pool1, because later puts overwrite earlier entries under the same key. One copy each of default and duplicate_pool1 therefore lives on in _Pool.schedulableQueue_ as a redundant, unused pool.
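The divergence between the two collections can be sketched as follows. This is a simplified model using pool names only, not the actual _org.apache.spark.scheduler.Pool_ code: the queue keeps every parsed <pool> element, while the name-keyed map silently collapses duplicates.

{code:scala}
import java.util.concurrent.{ConcurrentHashMap, ConcurrentLinkedQueue}

// Current behavior, modeled with pool names only: each parsed <pool> element
// is appended to schedulableQueue unconditionally, while the name-keyed map
// overwrites earlier entries, so the two collections diverge.
val schedulableQueue = new ConcurrentLinkedQueue[String]()
val schedulableNameToSchedulable = new ConcurrentHashMap[String, String]()

val parsedPoolNames = Seq("default", "default", "duplicate_pool1", "duplicate_pool1")
for (name <- parsedPoolNames) {
  schedulableQueue.add(name)                    // all 4 entries survive here
  schedulableNameToSchedulable.put(name, name)  // only 2 distinct keys survive
}
{code}

After the loop, the queue holds 4 entries while the map holds 2, which matches the debug observation above.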
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org