Posted to user@spark.apache.org by Sadaf <sa...@platalytics.com> on 2015/08/09 09:52:31 UTC

ERROR ReceiverTracker: Deregistered receiver for stream 0: Stopped by driver

Hi
When I try to stop Spark Streaming using ssc.stop(false, true), it gives the
following error.

ERROR ReceiverTracker: Deregistered receiver for stream 0: Stopped by driver
15/08/07 13:41:11 WARN ReceiverSupervisorImpl: Stopped executor without
error
15/08/07 13:41:20 WARN WriteAheadLogManager : Failed to write to write ahead
log

I've implemented a StreamingListener and a custom Receiver. Does anyone have
any idea about this?

Thanks :)
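For context, the stop call in question can be sketched roughly as below. This is an illustrative reconstruction (the app name, batch interval, and master URL are my own assumptions, not from the original post); the two booleans map to StreamingContext.stop(stopSparkContext, stopGracefully):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object GracefulStopSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical app setup, just to make the sketch self-contained.
    val conf = new SparkConf().setAppName("graceful-stop-demo").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(1))
    // ... register input streams and output operations here ...
    ssc.start()
    // Stop only the streaming context, keeping the SparkContext alive (false),
    // and wait for already-received data to be processed first (true).
    ssc.stop(stopSparkContext = false, stopGracefully = true)
  }
}
```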




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/ERROR-ReceiverTracker-Deregistered-receiver-for-stream-0-Stopped-by-driver-tp24183.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: ERROR ReceiverTracker: Deregistered receiver for stream 0: Stopped by driver

Posted by Sadaf Khan <sa...@platalytics.com>.
Okay.
Then do you have any idea how to avoid this error?
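If the message is benign, as the reply below suggests, one workaround (my own suggestion, not something confirmed in this thread) is to raise the log threshold for that class in log4j.properties so the shutdown message no longer surfaces at ERROR level:

```properties
# Demote ReceiverTracker output during driver-initiated shutdown.
# Class path is as of Spark 1.x; adjust for your Spark version.
log4j.logger.org.apache.spark.streaming.scheduler.ReceiverTracker=FATAL
```

Note this silences all ReceiverTracker messages below FATAL, including genuine errors, so it is a blunt instrument.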

Thanks

On Tue, Aug 11, 2015 at 12:08 AM, Tathagata Das <td...@databricks.com> wrote:

> I think this may be expected. When the streaming context is stopped
> without the SparkContext, then the receivers are stopped by the driver. The
> receiver sends back the message that it has been stopped. This is being
> (probably incorrectly) logged with ERROR level.
>
> On Sun, Aug 9, 2015 at 12:52 AM, Sadaf <sa...@platalytics.com> wrote:
>
>> Hi
>> When i tried to stop spark streaming using ssc.stop(false,true) It gives
>> the
>> following error.
>>
>> ERROR ReceiverTracker: Deregistered receiver for stream 0: Stopped by
>> driver
>> 15/08/07 13:41:11 WARN ReceiverSupervisorImpl: Stopped executor without
>> error
>> 15/08/07 13:41:20 WARN WriteAheadLogManager : Failed to write to write
>> ahead
>> log
>>
>> I've implemented Streaming Listener and a Custom Receiver. Does anyone has
>> idea about this?
>>
>> Thanks :)
>

Re: ERROR ReceiverTracker: Deregistered receiver for stream 0: Stopped by driver

Posted by Tathagata Das <td...@databricks.com>.
I think this may be expected. When the streaming context is stopped without
the SparkContext, then the receivers are stopped by the driver. The
receiver sends back the message that it has been stopped. This is being
(probably incorrectly) logged with ERROR level.
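Since the original poster mentioned a custom receiver, a minimal receiver skeleton (an illustrative sketch of the Receiver API, not code from this thread) shows where the driver-initiated stop lands: the framework calls onStop(), and the supervisor then reports the deregistration that surfaces as the ERROR line above.

```scala
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Illustrative custom receiver: when ssc.stop(...) runs, the driver asks each
// receiver's supervisor to stop it, which invokes onStop() below and then
// reports "Deregistered receiver ... Stopped by driver" back to the driver.
class DemoReceiver extends Receiver[String](StorageLevel.MEMORY_ONLY) {
  @volatile private var active = false

  def onStart(): Unit = {
    active = true
    new Thread("demo-receiver") {
      override def run(): Unit = {
        while (active && !isStopped()) {
          store("tick")      // hand one record to Spark
          Thread.sleep(1000)
        }
      }
    }.start()
  }

  def onStop(): Unit = {
    // Called during driver-initiated shutdown; release resources here.
    active = false
  }
}
```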

On Sun, Aug 9, 2015 at 12:52 AM, Sadaf <sa...@platalytics.com> wrote:

> Hi
> When i tried to stop spark streaming using ssc.stop(false,true) It gives
> the
> following error.
>
> ERROR ReceiverTracker: Deregistered receiver for stream 0: Stopped by
> driver
> 15/08/07 13:41:11 WARN ReceiverSupervisorImpl: Stopped executor without
> error
> 15/08/07 13:41:20 WARN WriteAheadLogManager : Failed to write to write
> ahead
> log
>
> I've implemented Streaming Listener and a Custom Receiver. Does anyone has
> idea about this?
>
> Thanks :)