Posted to user@spark.apache.org by Juan Rodríguez Hortalá <ju...@gmail.com> on 2015/07/13 12:35:01 UTC

Stopping StreamingContext before receiver has started

Hi,

I have noticed that when StreamingContext.stop is called before any receiver
has started, the context is not really stopped. Judging from the logs, a stop
signal is sent to 0 receivers, because the receivers have not started yet;
the receivers then start anyway and the context keeps running:

15/07/13 12:24:52 INFO ReceiverTracker: Sent stop signal to all 0 receivers
15/07/13 12:24:52 INFO RecurringTimer: Started timer for BlockGenerator at
time 1436783092600
15/07/13 12:24:52 INFO BlockGenerator: Started BlockGenerator
15/07/13 12:24:52 INFO BlockGenerator: Started block pushing thread
15/07/13 12:24:52 INFO ReceiverSupervisorImpl: Starting receiver
15/07/13 12:24:52 INFO ActorReceiver: Supervision tree for receivers
initialized at:akka://sparkDriver/user/Supervisor0
15/07/13 12:24:52 INFO ActorReceiver: Started receiver worker
at:akka://sparkDriver/user/Supervisor0/ReceiverActorFoo
15/07/13 12:24:52 INFO ReceiverSupervisorImpl: Called receiver onStart
15/07/13 12:24:52 INFO ProxyReceiverActor: Starting
es.ucm.fdi.sscheck.testing.ProxyReceiverActor
Actor[akka://sparkDriver/user/Supervisor0/ReceiverActorFoo#629970050]
15/07/13 12:24:52 INFO ReceiverTracker: Registered receiver for stream 0
from 192.168.56.1:49724
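
For reference, this is roughly the kind of setup that produces it. It is a
simplified sketch, not my actual code: it uses socketTextStream as a stand-in
for my custom actor receiver, and the master, app name and port are just
placeholders.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("stop-before-receiver-starts")
val ssc = new StreamingContext(conf, Seconds(1))

// Any receiver-based input stream shows the problem; socketTextStream is
// only used here as a simple example of a receiver.
val lines = ssc.socketTextStream("localhost", 9999)
lines.print()

ssc.start()
// Calling stop() right away, before the receiver has registered with the
// ReceiverTracker, results in "Sent stop signal to all 0 receivers": the
// receiver starts afterwards and the context keeps running.
ssc.stop(stopSparkContext = true)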

My workaround is to use a streaming listener and wait for onReceiverStarted,
so that I don't call stop before that event has occurred:
https://gist.github.com/juanrh/b43cdad2d1250b676794 . This is a minor issue,
but I think it is surprising behaviour that should probably be documented
somewhere.
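
In case it is useful, here is a simplified sketch of that idea. It is not
the exact code from the gist: the class name is just an example, and it uses
a CountDownLatch to wait for the event.

import java.util.concurrent.{CountDownLatch, TimeUnit}

import org.apache.spark.streaming.scheduler.{StreamingListener, StreamingListenerReceiverStarted}

// Releases a latch once the first receiver reports that it has started.
class ReceiverStartedListener extends StreamingListener {
  val startedLatch = new CountDownLatch(1)
  override def onReceiverStarted(
      receiverStarted: StreamingListenerReceiverStarted): Unit = {
    startedLatch.countDown()
  }
}

// Usage: register the listener before start(), and only call stop() after
// the latch has been released.
// val listener = new ReceiverStartedListener
// ssc.addStreamingListener(listener)
// ssc.start()
// ... exercise the streaming job ...
// listener.startedLatch.await(10, TimeUnit.SECONDS)
// ssc.stop(stopSparkContext = true)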

Greetings,

Juan

Re: Stopping StreamingContext before receiver has started

Posted by Tathagata Das <td...@databricks.com>.
This is a known race condition; it is the root cause of SPARK-5681
<https://issues.apache.org/jira/browse/SPARK-5681>

On Mon, Jul 13, 2015 at 3:35 AM, Juan Rodríguez Hortalá <
juan.rodriguez.hortala@gmail.com> wrote:

> Hi,
>
> I have noticed that when StreamingContext.stop is called before any receiver
> has started, the context is not really stopped. Judging from the logs, a stop
> signal is sent to 0 receivers, because the receivers have not started yet;
> the receivers then start anyway and the context keeps running:
>
> 15/07/13 12:24:52 INFO ReceiverTracker: Sent stop signal to all 0 receivers
> 15/07/13 12:24:52 INFO RecurringTimer: Started timer for BlockGenerator at
> time 1436783092600
> 15/07/13 12:24:52 INFO BlockGenerator: Started BlockGenerator
> 15/07/13 12:24:52 INFO BlockGenerator: Started block pushing thread
> 15/07/13 12:24:52 INFO ReceiverSupervisorImpl: Starting receiver
> 15/07/13 12:24:52 INFO ActorReceiver: Supervision tree for receivers
> initialized at:akka://sparkDriver/user/Supervisor0
> 15/07/13 12:24:52 INFO ActorReceiver: Started receiver worker
> at:akka://sparkDriver/user/Supervisor0/ReceiverActorFoo
> 15/07/13 12:24:52 INFO ReceiverSupervisorImpl: Called receiver onStart
> 15/07/13 12:24:52 INFO ProxyReceiverActor: Starting
> es.ucm.fdi.sscheck.testing.ProxyReceiverActor
> Actor[akka://sparkDriver/user/Supervisor0/ReceiverActorFoo#629970050]
> 15/07/13 12:24:52 INFO ReceiverTracker: Registered receiver for stream 0
> from 192.168.56.1:49724
>
> My workaround is to use a streaming listener and wait for onReceiverStarted,
> so that I don't call stop before that event has occurred:
> https://gist.github.com/juanrh/b43cdad2d1250b676794 . This is a minor issue,
> but I think it is surprising behaviour that should probably be documented
> somewhere.
>
> Greetings,
>
> Juan
>