Posted to dev@hive.apache.org by "zhihai xu (JIRA)" <ji...@apache.org> on 2017/04/14 21:57:42 UTC

[jira] [Created] (HIVE-16456) Kill spark job when InterruptedException happens or driverContext.isShutdown is true.

zhihai xu created HIVE-16456:
--------------------------------

             Summary: Kill spark job when InterruptedException happens or driverContext.isShutdown is true.
                 Key: HIVE-16456
                 URL: https://issues.apache.org/jira/browse/HIVE-16456
             Project: Hive
          Issue Type: Improvement
            Reporter: zhihai xu
            Assignee: zhihai xu
            Priority: Minor


Kill the Spark job when an InterruptedException happens or driverContext.isShutdown is true. If an InterruptedException occurs in RemoteSparkJobMonitor or LocalSparkJobMonitor, it is better to kill the job. There is also a race condition between submitting the Spark job and query/operation cancellation, so it is better to check driverContext.isShutdown right after submitting the Spark job. This guarantees the job is killed no matter when shutdown is called. It is similar to HIVE-15997.
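The two fixes described above can be sketched as follows. This is a minimal illustration of the pattern, not the actual Hive patch: DriverContext and SparkJobRef here are simplified stand-ins for the real Hive classes, and the method names are assumptions for illustration only.

```java
import java.util.concurrent.atomic.AtomicBoolean;

public class Sketch {

    // Simplified stand-in for Hive's DriverContext shutdown flag.
    static class DriverContext {
        private final AtomicBoolean shutdown = new AtomicBoolean(false);
        boolean isShutdown() { return shutdown.get(); }
        void shutdown() { shutdown.set(true); }
    }

    // Simplified stand-in for a handle to a submitted Spark job.
    static class SparkJobRef {
        private volatile boolean killed = false;
        void cancelJob() { killed = true; }
        boolean isKilled() { return killed; }
    }

    // Fix 1: re-check isShutdown immediately after submission, so a
    // cancellation that races with submit still results in a killed job.
    static SparkJobRef submitWithShutdownCheck(DriverContext ctx) {
        SparkJobRef jobRef = new SparkJobRef(); // job submission happens here
        if (ctx.isShutdown()) {
            jobRef.cancelJob(); // shutdown raced with submission: kill the job
        }
        return jobRef;
    }

    // Fix 2: in the monitor loop, kill the job when the monitoring thread
    // is interrupted, instead of leaking a running job.
    static void monitor(SparkJobRef jobRef) {
        try {
            Thread.sleep(10); // stand-in for polling the remote job state
        } catch (InterruptedException e) {
            jobRef.cancelJob();                 // interrupted: kill the job
            Thread.currentThread().interrupt(); // preserve interrupt status
        }
    }
}
```

Without the re-check, a shutdown() call that lands between the cancellation check and the submit call would leave the job running; checking again after submission closes that window.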



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)