Posted to issues@hive.apache.org by "Sahil Takiar (JIRA)" <ji...@apache.org> on 2018/10/16 13:49:00 UTC

[jira] [Commented] (HIVE-20519) Remove 30m min value for hive.spark.session.timeout

    [ https://issues.apache.org/jira/browse/HIVE-20519?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16651727#comment-16651727 ] 

Sahil Takiar commented on HIVE-20519:
-------------------------------------

Another issue that needs to be fixed with this feature is that we should support closing a SparkSession while it is still being opened. The old code was capable of doing that. Supporting this is important for query cancellation, especially on a busy cluster where starting the Spark driver can take a while.
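A minimal sketch of the idea (not Hive's actual implementation): if the session open runs on a background thread, a cancel request can interrupt the open itself rather than waiting for the driver to finish starting. The class and names below are hypothetical.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Hypothetical sketch: open the session on a worker thread so that a
// cancel request can interrupt the open while it is still in progress.
public class CancellableOpen {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        // Stand-in for slow Spark driver startup on a busy cluster.
        Future<String> opening = pool.submit(() -> {
            Thread.sleep(60_000);
            return "session-ready";
        });
        // The user cancels the query while the driver is still starting:
        boolean cancelled = opening.cancel(true); // interrupts the sleep
        System.out.println("cancelled=" + cancelled);
        pool.shutdownNow();
    }
}
```

With the session tracked as a Future, cancellation before the open completes is a one-line `cancel(true)` instead of waiting for the driver and then tearing it down.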

> Remove 30m min value for hive.spark.session.timeout
> ---------------------------------------------------
>
>                 Key: HIVE-20519
>                 URL: https://issues.apache.org/jira/browse/HIVE-20519
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Sahil Takiar
>            Assignee: Sahil Takiar
>            Priority: Major
>         Attachments: HIVE-20519.1.patch, HIVE-20519.2.patch
>
>
> In HIVE-14162 we added the config {{hive.spark.session.timeout}} which provided a way to time out Spark sessions that are active for a long period of time. The config has a lower bound of 30m which we should remove. It should be possible for users to configure this value so the HoS session is closed as soon as the query is complete.
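Once the 30m floor is removed, a user could set a short timeout in hive-site.xml; the 5m value below is a hypothetical illustration, not a recommended default.

```xml
<property>
  <name>hive.spark.session.timeout</name>
  <!-- Hypothetical value: below the current 30m minimum, so idle
       HoS sessions are reclaimed shortly after the query completes. -->
  <value>5m</value>
</property>
```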



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)