Posted to user@spark.apache.org by "Ambi, Aniket" <An...@harman.com> on 2018/09/21 09:00:51 UTC

Spark Use Case Analysis

Hi Team,
I am trying a use case with Spark Streaming, and I am not sure if I can solve it using Spark.

My Spark stream will listen to multiple Kafka topics, where each topic receives various counters with different values.
I need to evaluate multiple (around 200) KPI expressions using those counters and publish the results back onto Kafka.
My problem is that, to calculate a particular KPI, I am not sure in which batch the required counters will arrive. And I need to calculate around 200 such KPIs.
I am thinking about using Structured Streaming to keep the state, but I am unable to fit the solution into it.
Please help.
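
To make the question concrete, below is roughly the shape I have in mind. It is only a minimal sketch: the broker address, topic names, message schema (kpiKey / counterName / value as JSON) and the single hard-coded example KPI (a ratio of two counters) are all placeholders standing in for the ~200 real definitions. The idea is to use flatMapGroupsWithState to hold partial counter state per KPI key across batches until every counter needed by an expression has arrived.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.from_json
import org.apache.spark.sql.streaming.{GroupState, GroupStateTimeout, OutputMode, Trigger}
import org.apache.spark.sql.types._

// Illustrative message and state shapes; the real schema is not fixed yet.
case class Counter(kpiKey: String, counterName: String, value: Double)
case class KpiState(counters: Map[String, Double])
case class KpiResult(kpiKey: String, kpiValue: Double)

object KpiStream {

  // Example KPI: a ratio of two counters. The ~200 real expressions would be
  // looked up from some definition table keyed by kpiKey instead.
  def compute(counters: Map[String, Double]): Option[Double] =
    for {
      num <- counters.get("success_count")
      den <- counters.get("total_count") if den != 0.0
    } yield num / den

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("KpiStream").getOrCreate()
    import spark.implicits._

    val schema = new StructType()
      .add("kpiKey", StringType)
      .add("counterName", StringType)
      .add("value", DoubleType)

    // Subscribe to all counter topics in a single stream.
    val counters = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")   // placeholder broker
      .option("subscribePattern", "counters-.*")          // placeholder topics
      .load()
      .select(from_json($"value".cast("string"), schema).as("c"))
      .select("c.*")
      .as[Counter]

    // Keep per-KPI counter state across batches until every counter needed by
    // the expression has arrived, then emit the KPI and clear the state.
    val kpis = counters
      .groupByKey(_.kpiKey)
      .flatMapGroupsWithState(OutputMode.Append(), GroupStateTimeout.ProcessingTimeTimeout()) {
        (key: String, rows: Iterator[Counter], state: GroupState[KpiState]) =>
          if (state.hasTimedOut) {
            state.remove()                             // drop stale partial state
            Iterator.empty
          } else {
            val seen = state.getOption.map(_.counters).getOrElse(Map.empty) ++
              rows.map(c => c.counterName -> c.value)
            compute(seen) match {
              case Some(v) =>
                state.remove()                         // all inputs arrived; reset
                Iterator(KpiResult(key, v))
              case None =>
                state.update(KpiState(seen))           // still waiting for counters
                state.setTimeoutDuration("1 hour")
                Iterator.empty
            }
          }
      }

    // Publish computed KPIs back to Kafka as JSON.
    kpis.selectExpr("kpiKey AS key", "to_json(struct(*)) AS value")
      .writeStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("topic", "kpi-results")                  // placeholder output topic
      .option("checkpointLocation", "/tmp/kpi-checkpoint")
      .outputMode("append")
      .trigger(Trigger.ProcessingTime("30 seconds"))
      .start()
      .awaitTermination()
  }
}

Does holding per-KPI state with flatMapGroupsWithState like this (with a timeout to drop counters that never complete) make sense, or is there a better way to model it?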




Thanks,
Aniket