Posted to issues@spark.apache.org by "t oo (JIRA)" <ji...@apache.org> on 2019/02/17 10:15:00 UTC

[jira] [Commented] (SPARK-20286) dynamicAllocation.executorIdleTimeout is ignored after unpersist

    [ https://issues.apache.org/jira/browse/SPARK-20286?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16770344#comment-16770344 ] 

t oo commented on SPARK-20286:
------------------------------

gentle ping

> dynamicAllocation.executorIdleTimeout is ignored after unpersist
> ----------------------------------------------------------------
>
>                 Key: SPARK-20286
>                 URL: https://issues.apache.org/jira/browse/SPARK-20286
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.0.1
>            Reporter: Miguel Pérez
>            Priority: Major
>
> With dynamic allocation enabled, it seems that executors whose cached data has been unpersisted are still governed by the {{dynamicAllocation.cachedExecutorIdleTimeout}} setting instead of {{dynamicAllocation.executorIdleTimeout}}. Under the default configuration ({{dynamicAllocation.cachedExecutorIdleTimeout = Infinity}}), an executor whose data has been unpersisted won't be released until the job ends.
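> For reference, the two settings involved (the full keys carry the {{spark.}} prefix; the defaults shown here are the documented Spark defaults):
> {code}
> spark.dynamicAllocation.executorIdleTimeout          60s        (idle executors)
> spark.dynamicAllocation.cachedExecutorIdleTimeout    infinity   (executors holding cached blocks)
> {code}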
> *How to reproduce*
> - Set different values for {{dynamicAllocation.executorIdleTimeout}} and {{dynamicAllocation.cachedExecutorIdleTimeout}}
> - Load a file into a RDD and persist it
> - Execute an action on the RDD (like a count) so some executors are activated.
> - When the action has finished, unpersist the RDD
> - The application UI correctly removes the persisted data from the *Storage* tab, but in the *Executors* tab you will find that the executors remain *active* until {{dynamicAllocation.cachedExecutorIdleTimeout}} is reached (see the sketch below).
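> A minimal sketch of these steps in spark-shell; the timeout values, extra flags, and file path are illustrative, not from the original report:
> {code:scala}
> // Start the shell with dynamic allocation enabled and the two timeouts
> // set to clearly different values, e.g.:
> //   --conf spark.dynamicAllocation.enabled=true
> //   --conf spark.shuffle.service.enabled=true
> //   --conf spark.dynamicAllocation.executorIdleTimeout=60s
> //   --conf spark.dynamicAllocation.cachedExecutorIdleTimeout=600s
> val rdd = sc.textFile("/tmp/some-large-file.txt")  // illustrative path
> rdd.persist()
> rdd.count()      // action: executors spin up and the RDD is cached
> rdd.unpersist()  // Storage tab empties immediately
> // Expected: idle executors are released after executorIdleTimeout (60s).
> // Observed: they stay active until cachedExecutorIdleTimeout (600s).
> {code}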


