Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2017/06/11 08:56:19 UTC

[jira] [Resolved] (SPARK-20935) A daemon thread, "BatchedWriteAheadLog Writer", left behind after terminating StreamingContext.

     [ https://issues.apache.org/jira/browse/SPARK-20935?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-20935.
-------------------------------
       Resolution: Fixed
    Fix Version/s: 2.3.0

Issue resolved by pull request 18224
[https://github.com/apache/spark/pull/18224]

> A daemon thread, "BatchedWriteAheadLog Writer", left behind after terminating StreamingContext.
> -----------------------------------------------------------------------------------------------
>
>                 Key: SPARK-20935
>                 URL: https://issues.apache.org/jira/browse/SPARK-20935
>             Project: Spark
>          Issue Type: Bug
>          Components: DStreams
>    Affects Versions: 1.6.3, 2.1.1
>            Reporter: Terence Yim
>             Fix For: 2.3.0
>
>
> With batched write ahead log enabled by default in the driver (SPARK-11731), if there is no receiver-based {{InputDStream}}, the "BatchedWriteAheadLog Writer" thread created by {{BatchedWriteAheadLog}} never gets shut down. 
> The root cause is https://github.com/apache/spark/blob/master/streaming/src/main/scala/org/apache/spark/streaming/scheduler/ReceiverTracker.scala#L168
> which never calls {{ReceivedBlockTracker.stop()}} (which in turn calls {{BatchedWriteAheadLog.close()}}) when there is no receiver-based input.
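The leak pattern described above can be sketched in plain Java (not actual Spark code; all names here are hypothetical stand-ins for BatchedWriteAheadLog and ReceiverTracker): a daemon writer thread is started eagerly in the log's constructor, but the buggy stop path only closes the log when receiver-based input exists, so the thread is left behind. One plausible shape of the fix is to close the log unconditionally.

```java
import java.util.concurrent.LinkedBlockingQueue;

// Stand-in for BatchedWriteAheadLog: the writer thread starts as soon
// as the log is constructed, whether or not any receiver will use it.
class BatchedLogSketch {
    private final LinkedBlockingQueue<String> queue = new LinkedBlockingQueue<>();
    final Thread writer;
    private volatile boolean active = true;

    BatchedLogSketch() {
        writer = new Thread(() -> {
            while (active) {
                try {
                    queue.take(); // ... write the dequeued record to the log ...
                } catch (InterruptedException e) {
                    return; // interrupted by close(): exit the loop
                }
            }
        }, "BatchedWriteAheadLog Writer");
        writer.setDaemon(true);
        writer.start();
    }

    void close() {
        active = false;
        writer.interrupt();
    }
}

// Stand-in for ReceiverTracker: owns the log and decides when to close it.
class TrackerSketch {
    final BatchedLogSketch log = new BatchedLogSketch();
    private final boolean hasReceivers;

    TrackerSketch(boolean hasReceivers) {
        this.hasReceivers = hasReceivers;
    }

    // Buggy stop path: mirrors the ReceiverTracker.stop() behavior described
    // above, which skips stopping the block tracker (and thus closing the
    // log) when there is no receiver-based input.
    void stopBuggy() {
        if (hasReceivers) {
            log.close();
        }
    }

    // Fixed stop path: close the log unconditionally so the writer thread
    // always terminates.
    void stopFixed() {
        log.close();
    }
}
```

With `new TrackerSketch(false)`, calling `stopBuggy()` leaves the "BatchedWriteAheadLog Writer" thread alive, while `stopFixed()` terminates it.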



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org