Posted to user@spark.apache.org by Subarna Chatterjee <te...@gmail.com> on 2017/10/30 09:55:07 UTC

Spark Streaming problem

Hello,

I am new to Spark and am doing a clustered deployment of a Kafka-based
wordcount application on Spark.

My problem is that the Spark UI does not show me any application-id (which
I need in order to check the execution stages of my jobs). I can only see
driver-ids for the Kafka producer and the consumer. Both the Kafka producer
and the consumer are running fine, and I am able to see the output as well.
But the application-id is important for me to check the execution steps. I
am attaching a screenshot of my UI. Can anyone kindly help?
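For reference, Spark's monitoring REST API exposed by the driver UI can also list running applications and their IDs; a minimal check, assuming the driver UI is reachable on localhost with the default port 4040 (host and port will differ in a clustered deployment):

```shell
# Ask the driver's monitoring REST API for the running applications.
# Each entry includes the application id and name.
# localhost:4040 is the default driver-UI address; adjust for your cluster.
curl http://localhost:4040/api/v1/applications
```

If this returns an empty list while the job is clearly running, that would confirm the UI problem rather than a problem with the job itself.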

Thanks and regards
Subarna