Posted to dev@hive.apache.org by "Chengxiang Li (JIRA)" <ji...@apache.org> on 2014/11/25 10:30:12 UTC

[jira] [Updated] (HIVE-8959) SparkSession is not closed until JVM exit.[Spark Branch]

     [ https://issues.apache.org/jira/browse/HIVE-8959?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Chengxiang Li updated HIVE-8959:
--------------------------------
    Status: Patch Available  (was: Open)

> SparkSession is not closed until JVM exit.[Spark Branch]
> --------------------------------------------------------
>
>                 Key: HIVE-8959
>                 URL: https://issues.apache.org/jira/browse/HIVE-8959
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Chengxiang Li
>            Assignee: Chengxiang Li
>              Labels: Spark-M3
>         Attachments: HIVE-8959.1-spark.patch
>
>
> During unit tests, SparkSession is closed by a Runtime shutdown hook, which means it is not closed until the JVM exits. In the unit test suite, each qfile, as a single test case, resets SessionState, which leads to a new SparkSession being created for each qfile. Since RemoteSparkClient is tied to a specific SparkSession, more and more executors are launched over the course of the test run until no more resources are available and the tests block.
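
The sketch below illustrates the general idea behind such a fix, not the actual patch: tie the SparkSession's lifetime to the SessionState so it is closed eagerly when the session is reset, instead of waiting for a JVM shutdown hook. The SparkSession and SessionState types shown here are simplified, hypothetical stand-ins, not Hive's real classes.

    import java.io.Closeable;
    import java.io.IOException;

    // Hypothetical stand-in for Hive's Spark session abstraction.
    // close() is expected to shut down the RemoteSparkClient and
    // release its executors.
    interface SparkSession extends Closeable {
    }

    // Sketch of a SessionState that owns its SparkSession and closes it
    // as soon as the session is reset, rather than deferring the cleanup
    // to a JVM shutdown hook.
    class SessionState implements Closeable {
        private SparkSession sparkSession;

        void setSparkSession(SparkSession session) {
            this.sparkSession = session;
        }

        @Override
        public void close() throws IOException {
            if (sparkSession != null) {
                sparkSession.close();   // releases executors right away
                sparkSession = null;
            }
        }
    }

With this shape, each qfile's reset of SessionState would release that session's executors immediately, so resource usage stays bounded across the test suite.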



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)