Posted to user@spark.apache.org by khaledh <kh...@gmail.com> on 2014/09/29 19:45:35 UTC

Ack RabbitMQ messages after processing through Spark Streaming

Hi,

I'm currently investigating whether it's possible in Spark Streaming to send
acks back to RabbitMQ after a message has gone through the processing
pipeline. The problem is that the Receiver is the component that holds the
RabbitMQ channel open for receiving messages, but due to reliability concerns
we don't want to ack messages as soon as they're received; we want to defer
the ack until they have been completely processed and persisted.

So the question is: how can the Receiver tell when a message has made it
through the pipeline and is safe to ack?
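For context, here is a minimal sketch of the kind of custom receiver involved. The class and parameter names (`RabbitMQReceiver`, `queueName`, etc.) are hypothetical; the RabbitMQ Java client calls (`basicGet`, `basicAck`) and the Spark `Receiver` API are real, but note the comment on `store()`: this sketch acks on receipt, which is exactly the behavior the question wants to avoid.

```scala
import com.rabbitmq.client.{Channel, Connection, ConnectionFactory, GetResponse}
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Hypothetical sketch of a custom RabbitMQ receiver; all names here
// are illustrative, not part of Spark or the RabbitMQ client.
class RabbitMQReceiver(host: String, queueName: String)
    extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

  @volatile private var connection: Connection = _
  @volatile private var channel: Channel = _

  override def onStart(): Unit = {
    val factory = new ConnectionFactory()
    factory.setHost(host)
    connection = factory.newConnection()
    channel = connection.createChannel()
    new Thread("RabbitMQ Receiver") {
      override def run(): Unit = receive()
    }.start()
  }

  private def receive(): Unit = {
    while (!isStopped()) {
      // autoAck = false: we ack manually below
      val response: GetResponse = channel.basicGet(queueName, false)
      if (response != null) {
        val body = new String(response.getBody, "UTF-8")
        // The single-record store() only buffers the data inside the
        // receiver; it does not wait for it to be written or replicated.
        // Acking immediately after it therefore only guarantees receipt,
        // not that the message survived into the processing pipeline.
        store(body)
        channel.basicAck(response.getEnvelope.getDeliveryTag, false)
      }
    }
  }

  override def onStop(): Unit = {
    if (channel != null) channel.close()
    if (connection != null) connection.close()
  }
}
```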



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Ack-RabbitMQ-messages-after-processing-through-Spark-Streaming-tp15348.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Ack RabbitMQ messages after processing through Spark Streaming

Posted by khaledh <kh...@gmail.com>.
As a follow-up to my own question: I see that FlumeBatchFetcher
<https://github.com/apache/spark/blob/master/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeBatchFetcher.scala>
acks the batch only after it calls Receiver.store(...).

So my question is: does store() guarantee that, by the time the call returns,
the data has been replicated across the Spark cluster? In other words, is it
safe to ack received data after store() returns?
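The pattern FlumeBatchFetcher follows can be sketched roughly as below. This is an illustrative method inside a hypothetical custom receiver like the one discussed above; `channel` and `queueName` are assumed fields, and the key point is the comment on the multi-record store(): unlike the single-record variant, it blocks until the batch has been handed to the block manager (and, with a replicated StorageLevel such as MEMORY_AND_DISK_2, replicated), which is what makes acking after it return a defensible choice.

```scala
import scala.collection.mutable.ArrayBuffer

// Sketch only: assumed to live inside a Receiver[String] subclass
// that holds `channel: com.rabbitmq.client.Channel` and
// `queueName: String` as fields.
private def fetchAndStoreBatch(batchSize: Int): Unit = {
  val buffer = new ArrayBuffer[String]()
  var lastDeliveryTag = -1L
  var fetched = 0
  while (fetched < batchSize) {
    val response = channel.basicGet(queueName, false) // autoAck = false
    if (response != null) {
      buffer += new String(response.getBody, "UTF-8")
      lastDeliveryTag = response.getEnvelope.getDeliveryTag
    }
    fetched += 1
  }
  if (buffer.nonEmpty) {
    // The multi-record store(ArrayBuffer) blocks until the batch has
    // been stored by the block manager -- unlike the single-record
    // store(), which only buffers. This is the variant a reliable
    // receiver uses before acknowledging the source.
    store(buffer)
    // multiple = true acks every delivery up to lastDeliveryTag
    channel.basicAck(lastDeliveryTag, true)
  }
}
```

Note this guards against receiver failure (messages not yet stored are redelivered by RabbitMQ), but it still does not cover the original concern of acking only after full downstream processing and persistence.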



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Ack-RabbitMQ-messages-after-processing-through-Spark-Streaming-tp15348p15401.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
