Posted to user@spark.apache.org by charles li <ch...@gmail.com> on 2016/01/21 06:33:56 UTC

best practice: how to manage your Spark cluster?

I've posted a thread before:  pre-installing third-party Python packages
on a Spark cluster.

Currently I use *Fabric* to manage my cluster, but it's not enough for me,
and I believe there is a much better way to *manage and monitor* the
cluster.
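
For context, what I do with Fabric today is roughly this (a minimal
sketch using Fabric 1.x; the host and package names are placeholders):

    # fabfile.py -- ad-hoc cluster tasks with Fabric 1.x
    from fabric.api import env, run, sudo, task

    env.hosts = ['spark-worker-1', 'spark-worker-2']  # placeholder hosts

    @task
    def disk_usage():
        # one-off check of disk state on every worker
        run('df -h')

    @task
    def install_pkg(name):
        # pre-install a third-party Python package on each worker
        sudo('pip install %s' % name)

Running "fab disk_usage" or "fab install_pkg:numpy" fans the command out
to every host, but it gives me no dashboards and no real-time state.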

I believe there exist open source management tools that provide a web UI
allowing me to do exactly what I need:


   - monitor each cluster machine's state in real time, e.g. memory,
   network, disk
   - list all the services and packages on each machine
   - install / uninstall / upgrade / downgrade packages through a web UI
   - start / stop / restart services on each machine



great thanks

-- 
*--------------------------------------*
a spark lover, a quant, a developer and a good man.

http://github.com/litaotao

Re: best practice: how to manage your Spark cluster?

Posted by Arkadiusz Bicz <ar...@gmail.com>.
Hi Charles,

We are using Ambari for Hadoop / Spark service management, versioning,
and monitoring across the cluster.
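
Ambari's web UI covers your list (host metrics, service start/stop,
stack package management), and everything it shows is also scriptable
through its REST API. A rough sketch in Python (the Ambari host, cluster
name, and credentials are placeholders for your own setup):

    import requests

    AMBARI = 'http://ambari-host:8080/api/v1'  # placeholder Ambari server
    AUTH = ('admin', 'admin')                  # placeholder credentials

    # list the hosts Ambari manages in a given cluster
    r = requests.get(AMBARI + '/clusters/MyCluster/hosts', auth=AUTH)
    r.raise_for_status()
    for item in r.json()['items']:
        print(item['Hosts']['host_name'])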

For real-time monitoring of Spark jobs and of cluster hosts (disks,
memory, CPU, network) we use Graphite + Grafana + collectd + Spark
metrics:

http://www.hammerlab.org/2015/02/27/monitoring-spark-with-graphite-and-grafana/
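
The Spark side is just the built-in GraphiteSink; something like this in
conf/metrics.properties does it (host, port, and prefix are placeholders
for your own Graphite / carbon setup):

    # push all Spark metrics to Graphite every 10 seconds
    *.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
    *.sink.graphite.host=graphite-host
    *.sink.graphite.port=2003
    *.sink.graphite.period=10
    *.sink.graphite.unit=seconds
    *.sink.graphite.prefix=spark

collectd feeds the host-level metrics (disk, memory, CPU, network) into
the same Graphite instance, and Grafana dashboards sit on top of both.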

BR,

Arkadiusz Bicz

