Posted to user@spark.apache.org by skmishra <si...@gmail.com> on 2018/05/27 04:48:06 UTC

Aggregation of Streaming UI Statistics for multiple jobs

Hi,

I am working on a streaming use case where I need to run multiple Spark
streaming applications at the same time and measure their throughput and
latency. The Spark UI provides all of these statistics, but if I run more
than 100 applications at once, I have no practical way to aggregate them:
opening 100 browser windows and collecting the data by hand is not
feasible. If you could point me to a way to collect these statistics from
code, I can write a script to run my experiment. Any help is greatly
appreciated. Thanks in advance.

Regards,
Sitakanta Mishra 



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


Re: Aggregation of Streaming UI Statistics for multiple jobs

Posted by hemant singh <he...@gmail.com>.
You can explore the REST API:
https://spark.apache.org/docs/2.0.2/monitoring.html#rest-api
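
For example, a short Python script can poll the monitoring endpoints of
every running application and collect the numbers in one place. The sketch
below is illustrative only: the driver hosts/ports are placeholders, and
the streaming statistics endpoint and its JSON field names (per the
StreamingStatistics schema on the monitoring page) should be verified
against your Spark release, since not every version exposes them.

    # Sketch: aggregate streaming statistics across many running
    # applications via the Spark monitoring REST API. Host/port values
    # and JSON field names are assumptions; check them against the
    # monitoring docs for your Spark version.
    import requests

    # Hypothetical driver UI addresses, one per streaming application.
    # On a single host, successive drivers usually bind 4040, 4041, ...
    DRIVER_UIS = ["http://host1:4040", "http://host2:4041"]

    def streaming_stats(base_url):
        # List the applications served by this UI, then fetch the
        # streaming statistics for each one.
        apps = requests.get(base_url + "/api/v1/applications").json()
        for app in apps:
            url = "{0}/api/v1/applications/{1}/streaming/statistics".format(
                base_url, app["id"])
            yield app["id"], requests.get(url).json()

    for base in DRIVER_UIS:
        for app_id, stats in streaming_stats(base):
            print(app_id,
                  stats.get("avgInputRate"),        # records/sec
                  stats.get("avgProcessingTime"),   # ms
                  stats.get("avgSchedulingDelay"),  # ms
                  stats.get("avgTotalDelay"))       # ms

From there it is straightforward to write everything to a CSV and compute
aggregate throughput and latency across all 100+ applications instead of
reading each UI by hand.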

On Sun, May 27, 2018 at 10:18 AM, skmishra <si...@gmail.com>
wrote:

> Hi,
>
> I am working on a streaming use case where I need to run multiple Spark
> streaming applications at the same time and measure their throughput and
> latency. The Spark UI provides all of these statistics, but if I run more
> than 100 applications at once, I have no practical way to aggregate them:
> opening 100 browser windows and collecting the data by hand is not
> feasible. If you could point me to a way to collect these statistics from
> code, I can write a script to run my experiment. Any help is greatly
> appreciated. Thanks in advance.
>
> Regards,
> Sitakanta Mishra