Posted to dev@spark.apache.org by Isca Harmatz <po...@gmail.com> on 2014/12/02 06:18:05 UTC

Monitoring Spark

Hello,

I'm running Spark on a cluster and I want to monitor how many nodes/cores
are active at different (specific) points in the program.

Is there any way to do this?

Thanks,
  Isca
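
One way to probe this from inside the job is to ask the SparkContext
itself. Below is a minimal sketch, assuming a live SparkContext named sc;
ClusterProbe and activeExecutors are illustrative names, not Spark API:

    import org.apache.spark.{SparkConf, SparkContext}

    object ClusterProbe {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("ClusterProbe"))

        // getExecutorMemoryStatus has one entry per block manager
        // (every executor plus the driver), so size - 1 approximates
        // the number of live executors at this point in the program.
        def activeExecutors(): Int = sc.getExecutorMemoryStatus.size - 1

        // defaultParallelism typically reflects the total cores currently
        // registered across executors (this is scheduler-dependent).
        println(s"Executors: ${activeExecutors()}, cores (approx): ${sc.defaultParallelism}")

        // Run a stage, then probe again at a later point in the program.
        sc.parallelize(1 to 1000000).map(_ * 2).count()
        println(s"Executors after stage: ${activeExecutors()}")

        sc.stop()
      }
    }

Spark also exposes per-executor metrics through its web UI and its metrics
system (see the monitoring page in the Spark documentation), which may be a
better fit if you want continuous rather than point-in-time numbers.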