Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2021/09/08 08:40:00 UTC

[jira] [Commented] (SPARK-36693) Implement spark-shell idle timeouts

    [ https://issues.apache.org/jira/browse/SPARK-36693?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17411789#comment-17411789 ] 

Apache Spark commented on SPARK-36693:
--------------------------------------

User 'gyogal' has created a pull request for this issue:
https://github.com/apache/spark/pull/33936

> Implement spark-shell idle timeouts
> -----------------------------------
>
>                 Key: SPARK-36693
>                 URL: https://issues.apache.org/jira/browse/SPARK-36693
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Shell
>    Affects Versions: 3.1.2
>            Reporter: Gyorgy Gal
>            Priority: Major
>
> Many customers have been asking whether there is a setting they can use to kill an idle spark-shell, since they can't realistically go to each developer's desk and make them press Ctrl+D or run exit() when their work is done. Our response so far has been to use dynamic allocation, so that the executors are released after the specified idle timeout.
> However, this is not always an ideal solution: the shell process would still be there (its AM occupying only a small amount of resources), and the user still needs to kill the idle spark-shell via CM > Applications > spark-shell > Kill, or run 'kill -9' on the OS, to remove it. It would be nice to have a property in Spark (and exposed in CM) that deals with idle spark-shells, just like the idle timeout we have in beeline, and leave it to the admins to decide whether the idle spark-shell timeout should be one day or a week.
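For reference, the dynamic allocation workaround described above can be enabled when the shell is launched. A minimal sketch follows; the configuration keys are standard Spark dynamic allocation settings, while the YARN master and the 10-minute timeout are only illustrative:

    # Release executors after 10 minutes of inactivity.
    # Note: the driver/AM of the idle shell keeps running, which is
    # exactly the gap this issue asks to close.
    spark-shell \
      --master yarn \
      --conf spark.dynamicAllocation.enabled=true \
      --conf spark.dynamicAllocation.minExecutors=0 \
      --conf spark.dynamicAllocation.executorIdleTimeout=600s \
      --conf spark.dynamicAllocation.shuffleTracking.enabled=true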



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org