Posted to dev@spark.apache.org by Mayur Rustagi <ma...@gmail.com> on 2014/05/31 09:17:01 UTC

Fwd: Monitoring / Instrumenting jobs in 1.0

We have a JSON feed of the Spark application interface that we use for
easier instrumentation & monitoring. Has that been considered / found
relevant? We already sent it as a pull request against 0.9.0; would that
work, or should we update it to 1.0.0?


Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>



---------- Forwarded message ----------
From: Patrick Wendell <pw...@gmail.com>
Date: Sat, May 31, 2014 at 9:09 AM
Subject: Re: Monitoring / Instrumenting jobs in 1.0
To: user@spark.apache.org


The main change here was refactoring the SparkListener interface, which
is where we expose internal state about a Spark job to other
applications. We've cleaned up these APIs a bunch and also added a
way to log all event data as JSON for post-hoc analysis:

https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/SparkListener.scala
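
For concreteness, here is a minimal sketch of a custom listener built on
the 1.0 SparkListener trait linked above. The class name and the printed
messages are illustrative, not from this thread; the callback and field
names come from the SparkListener/StageInfo API:

    import org.apache.spark.scheduler.{SparkListener,
      SparkListenerStageCompleted, SparkListenerTaskEnd}

    // Illustrative listener that reports stage and task completion.
    class SimpleMonitoringListener extends SparkListener {
      // Called when a stage finishes; StageInfo carries the stage id,
      // name, and task count.
      override def onStageCompleted(
          stageCompleted: SparkListenerStageCompleted): Unit = {
        val info = stageCompleted.stageInfo
        println(s"Stage ${info.stageId} (${info.name}) completed " +
          s"with ${info.numTasks} tasks")
      }

      // Called once per finished task, with the end reason
      // (success, failure, etc.).
      override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
        println(s"Task in stage ${taskEnd.stageId} ended: ${taskEnd.reason}")
      }
    }

    // Register it on an existing SparkContext:
    //   sc.addSparkListener(new SimpleMonitoringListener)

The JSON logging Patrick mentions is the event log, turned on with the
spark.eventLog.enabled configuration property (spark.eventLog.dir
controls where the log files are written).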

- Patrick

On Fri, May 30, 2014 at 7:09 AM, Daniel Siegmann
<da...@velos.io> wrote:
> The Spark 1.0.0 release notes state "Internal instrumentation has been
> added to allow applications to monitor and instrument Spark jobs." Can
> anyone point me to the docs for this?
>
> --
> Daniel Siegmann, Software Developer
> Velos
> Accelerating Machine Learning
>
> 440 NINTH AVENUE, 11TH FLOOR, NEW YORK, NY 10001
> E: daniel.siegmann@velos.io W: www.velos.io