Posted to user@spark.apache.org by vibhatha <vi...@gmail.com> on 2019/10/30 01:51:11 UTC

Iterative Streaming with Spark

Hi,

I am benchmarking Flink, Storm, and Spark for an iterative streaming
application. The goal is to window a stream and perform an iterative
computation on each window.

Both Flink and Storm provide a window function that exposes the
window's contents as a list or iterator, but in Spark I am not sure
how to do this. Is it possible to get all the elements of a window as
a list or an iterator?
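Roughly, this is the kind of thing I have in mind (only a sketch,
assuming the DStream API with a socket source; the host, port, window
sizes, and local master are just placeholders):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object WindowIterationSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WindowIteration").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(1))

    // Placeholder source: one double per line over a socket.
    val values = ssc.socketTextStream("localhost", 9999).map(_.toDouble)

    // 10-second window sliding every 5 seconds; each window shows up
    // as a single RDD per slide interval.
    val windowed = values.window(Seconds(10), Seconds(5))

    windowed.foreachRDD { rdd =>
      // Collect the whole window to the driver so it becomes a local
      // collection that can be iterated over repeatedly. This only
      // works if one window fits in driver memory.
      val elems: Array[Double] = rdd.collect()
      // iterative computation over `elems` would go here
    }

    ssc.start()
    ssc.awaitTermination()
  }
}

I am not sure collecting to the driver is the right approach, which is
why I am asking whether there is a proper way to get at a window's
elements.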

After this windowed computation, the goal is to do a reduce operation
to get a globally synchronized value. A sketch of that part follows
below.
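
For the reduce step, I was imagining something like reduceByWindow
(again only a sketch, assuming a DStream[Double] like `values` above;
the sum and the window/slide durations are placeholders):

import org.apache.spark.streaming.Seconds
import org.apache.spark.streaming.dstream.DStream

object GlobalReduceSketch {
  // Fold every element of each window into a single value, emitted
  // once per slide interval.
  def windowedGlobalSum(values: DStream[Double]): DStream[Double] =
    values.reduceByWindow(_ + _, Seconds(10), Seconds(5))
}

What I am unsure about is whether this can be combined with the
per-window iterative computation above so that the reduced value is
available to all workers for the next iteration.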

Is this possible with Spark Streaming? 



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org