Posted to user@spark.apache.org by Esa Heikkinen <es...@student.tut.fi> on 2018/05/28 07:19:30 UTC

Execution model in Spark

Hi

I don't know whether this question is suitable for this forum, but I take the risk and ask :)

In my understanding, the execution model in Spark is very data-flow (stream) oriented and specific. Is it difficult to build control-flow logic (like a state machine) outside of the stream-specific processing?

Is the only way to combine all the different types of event streams into one big stream and then process it with some custom stateful "logic"?
And how would one build this logic?
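To make the question concrete, here is a minimal sketch (plain Python, deliberately not the Spark API) of the pattern being asked about: merge several typed event streams into one time-ordered stream, then fold a small state machine over it. In Spark Structured Streaming the closest built-in tool for this kind of arbitrary stateful processing is mapGroupsWithState / flatMapGroupsWithState; all names, event kinds, and transitions below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Event:
    source: str   # which original stream the event came from
    kind: str     # event type, e.g. "start", "work", "stop" (hypothetical)
    ts: int       # timestamp used to order the merged stream

# Transition table of a toy state machine: (state, event kind) -> next state.
TRANSITIONS = {
    ("idle", "start"): "running",
    ("running", "work"): "running",
    ("running", "stop"): "idle",
}

def run_state_machine(streams):
    """Merge the streams by timestamp and fold the state machine over them."""
    merged = sorted((e for s in streams for e in s), key=lambda e: e.ts)
    state = "idle"
    history = [state]
    for event in merged:
        # Unknown (state, event) pairs leave the state unchanged.
        state = TRANSITIONS.get((state, event.kind), state)
        history.append(state)
    return history

stream_a = [Event("a", "start", 1), Event("a", "stop", 4)]
stream_b = [Event("b", "work", 2), Event("b", "work", 3)]
print(run_state_machine([stream_a, stream_b]))
# → ['idle', 'running', 'running', 'running', 'idle']
```

In a real Spark job the merge step would be a union of the input streams and the fold would live inside the user-supplied state-update function, keyed by whatever entity the state machine tracks.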

Best Regards, Esa