Posted to user@spark.apache.org by anna stax <an...@gmail.com> on 2017/06/28 00:03:03 UTC
Spark standalone , client mode. How do I monitor?
Hi all,
I have a Spark standalone cluster running a Spark Streaming
application in client deploy mode. I am looking for the best
way to monitor the cluster and the application so that I will know when
either goes down. I cannot move to cluster deploy mode right now.
I appreciate your thoughts.
Thanks
-Anna
Re: Spark standalone , client mode. How do I monitor?
Posted by Nirav Patel <np...@xactlycorp.com>.
You can use Ganglia, Ambari, or Nagios to monitor the Spark workers and
master. Spark executors are resilient. There are also many proprietary
vendors whose products focus on Hadoop application monitoring.
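Besides an external monitor, the standalone master's web UI also serves its status as JSON at the `/json` path, which a small watchdog can poll from cron. A minimal sketch, assuming a master UI at `spark-master:8080` and an application named `my-streaming-app` (both placeholders); the sample payload below is trimmed to just the fields the check reads:

```python
import json
import urllib.request

def fetch_status(master_ui="http://spark-master:8080"):
    """Fetch the standalone master's status JSON (URL is a placeholder)."""
    with urllib.request.urlopen(master_ui + "/json", timeout=5) as resp:
        return json.loads(resp.read())

def cluster_healthy(status, app_name):
    """True if at least one worker is ALIVE and the named app is running."""
    workers_alive = any(w.get("state") == "ALIVE"
                        for w in status.get("workers", []))
    app_running = any(a.get("name") == app_name
                      for a in status.get("activeapps", []))
    return workers_alive and app_running

# Trimmed sample shaped like the /json response, for illustration only.
sample = {
    "workers": [{"state": "ALIVE"}, {"state": "DEAD"}],
    "activeapps": [{"name": "my-streaming-app"}],
}
print(cluster_healthy(sample, "my-streaming-app"))  # True
```

In client mode the driver runs outside the cluster, so a watchdog like this would also want to check the driver process itself (e.g. with a pidfile or a process supervisor), not just the master's view.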
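Since the reply mentions Ganglia: Spark's metrics system can push master, worker, and driver metrics to a Ganglia gmond directly. A sketch of a `conf/metrics.properties` fragment, assuming a gmond at `ganglia-host:8649` (placeholders); note the GangliaSink ships in the separate `spark-ganglia-lgpl` artifact for license reasons:

```
# Send metrics from all instances to a Ganglia gmond every 10 seconds.
*.sink.ganglia.class=org.apache.spark.metrics.sink.GangliaSink
*.sink.ganglia.host=ganglia-host
*.sink.ganglia.port=8649
*.sink.ganglia.period=10
*.sink.ganglia.unit=seconds
```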