Posted to issues@spark.apache.org by "Imran Rashid (JIRA)" <ji...@apache.org> on 2018/06/12 18:56:00 UTC

[jira] [Assigned] (SPARK-24416) Update configuration definition for spark.blacklist.killBlacklistedExecutors

     [ https://issues.apache.org/jira/browse/SPARK-24416?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Imran Rashid reassigned SPARK-24416:
------------------------------------

    Assignee: Sanket Reddy

> Update configuration definition for spark.blacklist.killBlacklistedExecutors
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-24416
>                 URL: https://issues.apache.org/jira/browse/SPARK-24416
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Sanket Reddy
>            Assignee: Sanket Reddy
>            Priority: Minor
>             Fix For: 2.4.0
>
>
> spark.blacklist.killBlacklistedExecutors is defined as:
> (Experimental) If set to "true", allow Spark to automatically kill, and attempt to re-create, executors when they are blacklisted. Note that, when an entire node is added to the blacklist, all of the executors on that node will be killed.
> I presume the killing of blacklisted executors only happens after the stage completes successfully and all of its tasks have finished, or on fetch failures (updateBlacklistForFetchFailure/updateBlacklistForSuccessfulTaskSet). The definition is confusing because it states that the executor will be killed and re-created as soon as it is blacklisted. That is not what happens: if an executor is blacklisted while a stage is in progress, Spark does not attempt any cleanup until the stage finishes.
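> For illustration, a minimal sketch (Scala; the config keys are the real blacklist settings as of Spark 2.3, the threshold value and app name are illustrative) of how a user would enable this behavior:
>
>     import org.apache.spark.SparkConf
>     import org.apache.spark.sql.SparkSession
>
>     val conf = new SparkConf()
>       .setAppName("blacklist-kill-demo")
>       // blacklisting must be enabled for the kill setting to have any effect
>       .set("spark.blacklist.enabled", "true")
>       // allow Spark to kill, and attempt to re-create, blacklisted executors
>       .set("spark.blacklist.killBlacklistedExecutors", "true")
>       // blacklist an executor for a stage after this many failed tasks
>       .set("spark.blacklist.stage.maxFailedTasksPerExecutor", "2")
>
>     val spark = SparkSession.builder().config(conf).getOrCreate()
>
>     // Per the behavior described above, even with killBlacklistedExecutors
>     // set to true, an executor blacklisted during a running stage is not
>     // killed until the stage finishes (updateBlacklistForSuccessfulTaskSet)
>     // or a fetch failure occurs (updateBlacklistForFetchFailure).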



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org