Posted to issues@hive.apache.org by "Sahil Takiar (JIRA)" <ji...@apache.org> on 2018/10/24 16:28:00 UTC

[jira] [Updated] (HIVE-20519) Remove 30m min value for hive.spark.session.timeout

     [ https://issues.apache.org/jira/browse/HIVE-20519?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sahil Takiar updated HIVE-20519:
--------------------------------
    Attachment: HIVE-20519.3.patch

> Remove 30m min value for hive.spark.session.timeout
> ---------------------------------------------------
>
>                 Key: HIVE-20519
>                 URL: https://issues.apache.org/jira/browse/HIVE-20519
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Sahil Takiar
>            Assignee: Sahil Takiar
>            Priority: Major
>         Attachments: HIVE-20519.1.patch, HIVE-20519.2.patch, HIVE-20519.3.patch
>
>
> In HIVE-14162 we added the config {{hive.spark.session.timeout}}, which provides a way to time out Spark sessions that have been active for a long period of time. The config has a lower bound of 30m, which we should remove. It should be possible for users to configure this value so that the HoS session is closed as soon as the query is complete.
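
For illustration only, a sketch of how a user might shrink the timeout once the 30m floor is gone, so the HoS session is torn down shortly after a query finishes. The property name comes from HIVE-14162 as quoted above; the 5s value is an assumption, as is the ability to override the property per session via SET rather than only in hive-site.xml.

    -- hypothetical session-level override; hive.spark.session.timeout is the
    -- property discussed in this issue, but the 5s value is illustrative and
    -- assumes the 30m lower bound has already been removed
    set hive.spark.session.timeout=5s;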



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)