Posted to user@spark.apache.org by Isca Harmatz <po...@gmail.com> on 2014/12/03 05:57:27 UTC

Monitoring Spark

Hello,

I'm running Spark on a cluster and I want to monitor how many nodes/cores
are active at different (specific) points in the program.

Is there any way to do this?

Thanks,
  Isca

Re: Monitoring Spark

Posted by Andrew Or <an...@databricks.com>.
If you're only interested in a particular instant, a simpler way is to
check the executors page in the Spark UI:
http://spark.apache.org/docs/latest/monitoring.html. By default each
executor runs one task per core, so the number of tasks running at a
given moment translates directly into the number of cores being used
for execution.
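
If you want the count at specific points inside your own program rather
than from the UI, one option is to register a SparkListener and count
running tasks yourself. Here is a minimal sketch in Scala; the
ActiveTaskListener name is just for illustration, addSparkListener is a
developer API, and exact listener signatures can vary across Spark
versions:

import java.util.concurrent.atomic.AtomicInteger
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd, SparkListenerTaskStart}

// Tracks how many tasks are running right now. Since each task occupies
// one core by default, this approximates the number of cores in use.
class ActiveTaskListener extends SparkListener {
  val activeTasks = new AtomicInteger(0)

  override def onTaskStart(taskStart: SparkListenerTaskStart) {
    activeTasks.incrementAndGet()
  }

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd) {
    activeTasks.decrementAndGet()
  }
}

val listener = new ActiveTaskListener
sc.addSparkListener(listener)  // sc is your SparkContext

// ...later, at the points of interest in your program:
println("cores in use: " + listener.activeTasks.get())

Note this only counts task slots actually executing work, which is
usually what you want when asking how much of the cluster is active at
a given point.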

2014-12-02 21:49 GMT-08:00 Otis Gospodnetic <ot...@gmail.com>:

> Hi Isca,
>
> I think SPM can do that for you:
> http://blog.sematext.com/2014/10/07/apache-spark-monitoring/
>
> Otis
> --
> Monitoring * Alerting * Anomaly Detection * Centralized Log Management
> Solr & Elasticsearch Support * http://sematext.com/
>
>
> On Tue, Dec 2, 2014 at 11:57 PM, Isca Harmatz <po...@gmail.com> wrote:
>
>> Hello,
>>
>> I'm running Spark on a cluster and I want to monitor how many nodes/cores
>> are active at different (specific) points in the program.
>>
>> Is there any way to do this?
>>
>> Thanks,
>>   Isca
>>
>
>

Re: Monitoring Spark

Posted by Otis Gospodnetic <ot...@gmail.com>.
Hi Isca,

I think SPM can do that for you:
http://blog.sematext.com/2014/10/07/apache-spark-monitoring/

Otis
--
Monitoring * Alerting * Anomaly Detection * Centralized Log Management
Solr & Elasticsearch Support * http://sematext.com/


On Tue, Dec 2, 2014 at 11:57 PM, Isca Harmatz <po...@gmail.com> wrote:

> Hello,
>
> I'm running Spark on a cluster and I want to monitor how many nodes/cores
> are active at different (specific) points in the program.
>
> Is there any way to do this?
>
> Thanks,
>   Isca
>