Posted to issues@spark.apache.org by "Jayadevan M (JIRA)" <ji...@apache.org> on 2016/01/18 18:01:39 UTC

[jira] [Commented] (SPARK-11137) Make StreamingContext.stop() exception-safe

    [ https://issues.apache.org/jira/browse/SPARK-11137?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15105523#comment-15105523 ] 

Jayadevan M commented on SPARK-11137:
-------------------------------------

[~thomastechs]
Working on the fix.

> Make StreamingContext.stop() exception-safe
> -------------------------------------------
>
>                 Key: SPARK-11137
>                 URL: https://issues.apache.org/jira/browse/SPARK-11137
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 1.5.1
>            Reporter: Felix Cheung
>            Priority: Minor
>
> In StreamingContext.stop(), when an exception is thrown, the rest of the stop/cleanup actions are aborted.
> Discussed in https://github.com/apache/spark/pull/9116, where srowen commented:
> Hm, this is getting unwieldy. There are several nested try blocks here. The same argument applies to many of these methods -- if one fails, should the others not continue trying? A tidier solution would be to execute a series of () -> Unit code blocks that each perform some cleanup, and to make sure they each fire in succession regardless of the others (see the sketch below). The final one, removing the shutdown hook, could occur outside synchronization.
> I realize we're expanding the scope of the change here, but is it maybe worthwhile to go all the way here?
> Really, something similar could be done for SparkContext and there's an existing JIRA for it somewhere.
> At least, I'd prefer to either narrowly fix the deadlock here, or fix all of the finally-related issues separately and all at once.
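
A minimal, self-contained sketch of the pattern srowen describes above -- the object name, the tryLogNonFatal helper, and the step labels are all hypothetical, not Spark's actual implementation. Each () => Unit cleanup block runs through a wrapper that catches and logs non-fatal exceptions, so a failure in one step does not prevent the later steps from firing:

    import scala.util.control.NonFatal

    object ExceptionSafeStop {
      // Run one cleanup block; catch and log non-fatal exceptions so the
      // remaining blocks still execute. Fatal errors still propagate.
      private def tryLogNonFatal(name: String)(body: => Unit): Unit =
        try body catch {
          case NonFatal(e) =>
            System.err.println(s"Ignoring exception while stopping $name: $e")
        }

      def main(args: Array[String]): Unit = {
        // Hypothetical stand-ins for the real stop/cleanup steps.
        val steps: Seq[(String, () => Unit)] = Seq(
          "receiver tracker" -> (() => throw new IllegalStateException("boom")),
          "job scheduler"    -> (() => println("job scheduler stopped")),
          "shutdown hook"    -> (() => println("shutdown hook removed"))
        )
        // Every step runs, regardless of whether an earlier one threw.
        steps.foreach { case (name, step) => tryLogNonFatal(name)(step()) }
      }
    }

Running this logs the failure from the first step and still executes the other two, which is the "fire in succession, regardless of the others" behavior the comment asks for. The final step (removing the shutdown hook) could additionally be kept outside any synchronized block, as suggested.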



