Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:04:42 UTC

[jira] [Updated] (SPARK-23287) Spark scheduler does not remove initial executor if no job is submitted

     [ https://issues.apache.org/jira/browse/SPARK-23287?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-23287:
---------------------------------
    Labels: bulk-closed  (was: )

> Spark scheduler does not remove initial executor if no job is submitted
> -----------------------------------------------------------------------
>
>                 Key: SPARK-23287
>                 URL: https://issues.apache.org/jira/browse/SPARK-23287
>             Project: Spark
>          Issue Type: Bug
>          Components: Mesos, Scheduler
>    Affects Versions: 2.2.1
>         Environment: Cluster manager - Mesos 1.4.1
> Spark 2.2.1
> spark app configuration:
> spark.dynamicAllocation.minExecutors=0
> spark.dynamicAllocation.executorIdleTimeout=25s
> spark.dynamicAllocation.initialExecutors=1
> spark.dynamicAllocation.schedulerBacklogTimeout=4s
> spark.dynamicAllocation.sustainedSchedulerBacklogTimeout=5s
>            Reporter: Pavel Plotnikov
>            Priority: Minor
>              Labels: bulk-closed
>
> When a Spark application is submitted, it deploys the initial number of executors. If no job has ever been submitted to the application, Spark does not remove the initial executor (a minimal reproduction sketch follows the configuration below).
>  
> Cluster manager - Mesos 1.4.1
> Spark 2.2.1
> spark app configuration:
> spark.dynamicAllocation.minExecutors=0
> spark.dynamicAllocation.executorIdleTimeout=25s
> spark.dynamicAllocation.initialExecutors=1
> spark.dynamicAllocation.schedulerBacklogTimeout=4s
> spark.dynamicAllocation.sustainedSchedulerBacklogTimeout=5s
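
A minimal reproduction sketch, assuming Spark 2.2.1 with the dynamic-allocation settings listed above, the external shuffle service running on the Mesos agents, and the Mesos master address supplied via spark-submit (none of these are spelled out in the report). The application submits no jobs and simply stays alive well past the 25s idle timeout, which is when the initial executor would be expected to be reclaimed:

    import org.apache.spark.sql.SparkSession

    object IdleAppRepro {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("dynamic-allocation-idle-repro")
          // Dynamic-allocation settings from the report. Dynamic allocation
          // also requires the external shuffle service, assumed to be running
          // on the Mesos agents.
          .config("spark.dynamicAllocation.enabled", "true")
          .config("spark.shuffle.service.enabled", "true")
          .config("spark.dynamicAllocation.minExecutors", "0")
          .config("spark.dynamicAllocation.initialExecutors", "1")
          .config("spark.dynamicAllocation.executorIdleTimeout", "25s")
          .config("spark.dynamicAllocation.schedulerBacklogTimeout", "4s")
          .config("spark.dynamicAllocation.sustainedSchedulerBacklogTimeout", "5s")
          .getOrCreate()

        // Submit no jobs: keep the application alive well past the 25s idle
        // timeout and observe (in the Spark UI or Mesos UI) whether the
        // initial executor is released.
        Thread.sleep(120000L)
        spark.stop()
      }
    }

With spark.dynamicAllocation.minExecutors=0, the idle initial executor should be released after executorIdleTimeout; per this report, that does not happen when no job has ever been run in the application.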



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org