Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/08/01 05:08:04 UTC

[jira] [Assigned] (SPARK-9519) Confirm stop sc successfully when application was killed

     [ https://issues.apache.org/jira/browse/SPARK-9519?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-9519:
-----------------------------------

    Assignee: Apache Spark

> Confirm stop sc successfully when application was killed
> --------------------------------------------------------
>
>                 Key: SPARK-9519
>                 URL: https://issues.apache.org/jira/browse/SPARK-9519
>             Project: Spark
>          Issue Type: Improvement
>          Components: YARN
>            Reporter: Weizhong
>            Assignee: Apache Spark
>            Priority: Minor
>
> Currently, when we kill an application on YARN, sc.stop() is called from the YARN application state monitor thread. YarnClientSchedulerBackend.stop() then calls interrupt() on that same thread, which causes the SparkContext to not stop fully, because the stop path is still waiting for the executors to exit.
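> Below is a minimal, self-contained sketch of that interaction using plain Scala stand-ins (FakeSparkContext, monitorThread are illustrative names, not the actual Spark/YARN classes):
> {code}
> object InterruptedStopSketch {
>   // Stand-in for SparkContext: stop() waits for executors to exit.
>   class FakeSparkContext {
>     @volatile var fullyStopped = false
>     def stop(): Unit = {
>       try {
>         Thread.sleep(5000) // stand-in for "wait for executors to exit"
>         fullyStopped = true
>       } catch {
>         case _: InterruptedException =>
>           // The interrupt lands here, so the context never finishes stopping.
>           println("sc.stop() interrupted before executors exited")
>       }
>     }
>   }
>
>   def main(args: Array[String]): Unit = {
>     val sc = new FakeSparkContext
>
>     // Stand-in for the YARN application state monitor thread:
>     // on seeing the application KILLED, it calls sc.stop().
>     val monitorThread = new Thread(new Runnable { def run(): Unit = sc.stop() })
>     monitorThread.start()
>
>     // Stand-in for YarnClientSchedulerBackend.stop(): it interrupts the monitor
>     // thread, which is the very thread that is currently running sc.stop().
>     Thread.sleep(500)
>     monitorThread.interrupt()
>     monitorThread.join()
>
>     println(s"fullyStopped = ${sc.fullyStopped}") // prints false
>   }
> }
> {code}
> Running this sketch reports that stop() was interrupted and fullyStopped = false, which mirrors the behavior described above; the improvement would be to confirm that sc.stop() has completed before interrupting the monitor thread.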



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org