Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/08/01 05:08:04 UTC

[jira] [Commented] (SPARK-9519) Confirm stop sc successfully when application was killed

    [ https://issues.apache.org/jira/browse/SPARK-9519?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14650122#comment-14650122 ] 

Apache Spark commented on SPARK-9519:
-------------------------------------

User 'Sephiroth-Lin' has created a pull request for this issue:
https://github.com/apache/spark/pull/7846

> Confirm stop sc successfully when application was killed
> --------------------------------------------------------
>
>                 Key: SPARK-9519
>                 URL: https://issues.apache.org/jira/browse/SPARK-9519
>             Project: Spark
>          Issue Type: Improvement
>          Components: YARN
>            Reporter: Weizhong
>            Priority: Minor
>
> Currently, when we kill an application on YARN, sc.stop() is called from the YARN application state monitor thread. YarnClientSchedulerBackend.stop() then interrupts that same monitor thread, which prevents the SparkContext from stopping fully, because the stop path still needs to wait for the executors to exit.
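The race described above can be sketched in a small, self-contained JVM example. The names below (stopBackend, waitForExecutorsToExit) are hypothetical stand-ins, not the actual Spark internals: the point is only that interrupting the thread that is itself running the stop path causes the subsequent wait to abort immediately.

```java
// Hypothetical sketch (simplified names, not actual Spark code) of the race
// described above: the YARN state-monitor thread calls sc.stop(), but the
// backend's stop() interrupts that same monitor thread, so the later
// wait-for-executors step is cut short and shutdown never completes cleanly.
public class InterruptedStopSketch {
    static volatile boolean stoppedFully = false;

    // Stands in for YarnClientSchedulerBackend.stop(): it interrupts the
    // monitor thread -- which is the very thread running this stop path.
    static void stopBackend() {
        Thread.currentThread().interrupt();
    }

    // Stands in for the part of SparkContext.stop() that waits for the
    // executors to exit before declaring shutdown complete.
    static void waitForExecutorsToExit() {
        try {
            Thread.sleep(100);   // placeholder for the real executor wait
            stoppedFully = true;
        } catch (InterruptedException e) {
            // Wait aborted by the earlier interrupt: stop never completes.
        }
    }

    public static void main(String[] args) throws Exception {
        Thread monitor = new Thread(() -> {
            stopBackend();            // sets this thread's interrupt flag
            waitForExecutorsToExit(); // sleep throws immediately
        });
        monitor.start();
        monitor.join();
        System.out.println("stopped fully: " + stoppedFully); // prints: stopped fully: false
    }
}
```

Because Thread.sleep checks the interrupt flag on entry, the wait throws InterruptedException right away and stoppedFully is never set, mirroring the incomplete SparkContext shutdown the issue reports.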



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org