Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/01/29 06:00:00 UTC

[jira] [Commented] (SPARK-26758) Idle Executors are not getting killed after spark.dynamicAllocation.executorIdleTimeout value

    [ https://issues.apache.org/jira/browse/SPARK-26758?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16754626#comment-16754626 ] 

Hyukjin Kwon commented on SPARK-26758:
--------------------------------------

Can you include a UI screenshot in the JIRA description to explain the issue?


> Idle Executors are not getting killed after spark.dynamicAllocation.executorIdleTimeout value
> ---------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26758
>                 URL: https://issues.apache.org/jira/browse/SPARK-26758
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 2.4.0
>         Environment: Spark Version:2.4
>            Reporter: ABHISHEK KUMAR GUPTA
>            Priority: Major
>
> Steps:
> 1. Submit a Spark shell with the settings below: 3 initial executors, minimum executors = 0, and executorIdleTimeout = 60s
> {code}
> bin/spark-shell --master yarn --conf spark.dynamicAllocation.enabled=true \
>   --conf spark.dynamicAllocation.initialExecutors=3 \
>   --conf spark.dynamicAllocation.minExecutors=0 \
>   --conf spark.dynamicAllocation.executorIdleTimeout=60s
> {code}
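> For reference, one quick way to watch the executor count from the shell itself (an illustrative check, not part of the original report; note that getExecutorMemoryStatus includes the driver's block manager, so subtract one):
> {code}
> // Count live executors from the Spark shell; the driver's block manager
> // appears in the map, so subtract 1 to get the executor count.
> def liveExecutors(): Int = sc.getExecutorMemoryStatus.size - 1
> println(s"Live executors: ${liveExecutors()}")
> // Re-run after executorIdleTimeout (60s) has elapsed to compare.
> {code}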
> 2. Launch the Spark UI and check the Executors tab
> Observation:
> The initial 3 executors are assigned. After 60s (executorIdleTimeout), the number of active executors remains the same.
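> The same information is exposed by the Spark monitoring REST API, which can back the UI observation with data (a hedged example; it assumes the default UI port 4040 and is run on the driver):
> {code}
> // Fetch the executor list from the monitoring REST API and print it;
> // each entry carries an "isActive" flag and task counts.
> import scala.io.Source
> val url = s"http://localhost:4040/api/v1/applications/${sc.applicationId}/executors"
> println(Source.fromURL(url).mkString)
> {code}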
> Expected:
> Apart from the AM container, all other executors should be removed.
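> For context, the dynamic-allocation idle check in ExecutorAllocationManager amounts to roughly the following (a simplified sketch with illustrative names, not Spark's actual code; executors holding cached blocks fall under cachedExecutorIdleTimeout instead):
> {code}
> // Simplified model of the idle-timeout decision: an executor with no
> // running tasks becomes a removal candidate once it has been idle
> // longer than executorIdleTimeout.
> case class ExecutorState(id: String, runningTasks: Int, idleSinceMs: Long)
>
> def removableExecutors(execs: Seq[ExecutorState],
>                        nowMs: Long,
>                        idleTimeoutMs: Long): Seq[String] =
>   execs.filter(e => e.runningTasks == 0 && nowMs - e.idleSinceMs >= idleTimeoutMs)
>        .map(_.id)
>
> // Example: with a 60s timeout, executors idle for >= 60000 ms are removable.
> val now = System.currentTimeMillis()
> val execs = Seq(ExecutorState("1", 0, now - 70000), ExecutorState("2", 1, now - 70000))
> println(removableExecutors(execs, now, 60000))  // prints List(1)
> {code}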



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org