Posted to dev@spark.apache.org by Gerard Maas <ge...@gmail.com> on 2014/10/30 21:53:35 UTC

Registering custom metrics

Hi,

I've been exploring the metrics exposed by Spark and I'm wondering whether
there's a way to register job-specific metrics that could be exposed
through the existing metrics system.

Would there be an example somewhere?

BTW, the documentation about how the metrics work could be improved. I
found out about the default servlet and the metrics/json/ endpoint in the
code, but I could not find any reference to them on the dedicated doc page
[1]. That is probably something I could contribute if nobody is working on
it at the moment.

-kr, Gerard.

[1]   http://spark.apache.org/docs/1.1.0/monitoring.html#Metrics

Re: Registering custom metrics

Posted by Dmitry Goldenberg <dg...@gmail.com>.
Great, thank you, Silvio. In your experience, is there any way to
instrument a callback into Coda Hale or the Spark consumers from the
metrics sink? If the sink performs some steps once it has received the
metrics, I'd like to be able to make the consumers aware of that via some
sort of callback.
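
A minimal sketch of one way to wire that up, assuming plain Coda Hale
Metrics 3.x (CallbackReporter and onReport are illustrative names, not an
existing API): a ScheduledReporter subclass can fire a callback after each
report cycle.

    import java.util.SortedMap
    import java.util.concurrent.TimeUnit
    import com.codahale.metrics._

    // Hypothetical reporter: forwards the metrics, then fires a callback
    // so consumers know a report cycle has completed.
    class CallbackReporter(registry: MetricRegistry, onReport: () => Unit)
      extends ScheduledReporter(registry, "callback-reporter",
        MetricFilter.ALL, TimeUnit.SECONDS, TimeUnit.MILLISECONDS) {

      override def report(gauges: SortedMap[String, Gauge[_]],
                          counters: SortedMap[String, Counter],
                          histograms: SortedMap[String, Histogram],
                          meters: SortedMap[String, Meter],
                          timers: SortedMap[String, Timer]): Unit = {
        // ... push the metrics to their destination here ...
        onReport() // notify interested consumers
      }
    }

Hooking this into Spark's own sink machinery would mean implementing the
[Spark] private Sink trait, so treat the above as the library-level idea
only.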

On Mon, Jun 22, 2015 at 10:14 AM, Silvio Fiorito <
silvio.fiorito@granturing.com> wrote:

> Sorry, I replied to Gerard’s question instead of yours.
>
> See here:
>
> Yes, you have to implement your own custom Metrics Source using the Coda
> Hale library. See here for some examples:
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/metrics/source/JvmSource.scala
>
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/master/ApplicationSource.scala
>
> The source gets registered; then you have to configure a sink for it, such
> as the JSON servlet you mentioned.
>
> I had done it in the past but unfortunately don’t have access to the source
> for that project anymore.
>
> Thanks,
> Silvio
>
>
>
>
>
>
> On 6/22/15, 9:57 AM, "dgoldenberg" <dg...@gmail.com> wrote:
>
> >Hi Gerard,
> >
> >Have there been any responses? Any insights as to what you ended up doing
> to
> >enable custom metrics? I'm thinking of implementing a custom metrics sink,
> >not sure how doable that is yet...
> >
> >Thanks.
> >
> >
> >
> >--
> >View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Registering-custom-metrics-tp17765p23426.html
> >Sent from the Apache Spark User List mailing list archive at Nabble.com.
> >
> >---------------------------------------------------------------------
> >To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> >For additional commands, e-mail: user-help@spark.apache.org
> >
>

Re: Registering custom metrics

Posted by Silvio Fiorito <si...@granturing.com>.
Sorry, I replied to Gerard’s question instead of yours.

See here:

Yes, you have to implement your own custom Metrics Source using the Coda Hale library. See here for some examples: https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/metrics/source/JvmSource.scala
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/master/ApplicationSource.scala

The source gets registered; then you have to configure a sink for it, such as the JSON servlet you mentioned.
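
For the sink side, a minimal conf/metrics.properties sketch, assuming the
stock ConsoleSink (any of the bundled sinks is configured the same way; the
period and unit values here are just examples):

    # Report all sources to the console every 10 seconds
    *.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
    *.sink.console.period=10
    *.sink.console.unit=seconds

The metrics servlet behind /metrics/json is enabled by default, so a custom
source shows up there without any extra sink configuration.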

I had done it in the past but unfortunately don’t have access to the source for that project anymore.

Thanks,
Silvio






On 6/22/15, 9:57 AM, "dgoldenberg" <dg...@gmail.com> wrote:

>Hi Gerard,
>
>Have there been any responses? Any insights as to what you ended up doing to
>enable custom metrics? I'm thinking of implementing a custom metrics sink,
>not sure how doable that is yet...
>
>Thanks.
>
>
>
>--
>View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Registering-custom-metrics-tp17765p23426.html
>Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>---------------------------------------------------------------------
>To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>For additional commands, e-mail: user-help@spark.apache.org
>

Re: Registering custom metrics

Posted by Otis Gospodnetić <ot...@gmail.com>.
Hi,

Not sure if this will fit your needs, but if you are trying to
collect and chart some metrics specific to your app, and want to correlate
them with what's going on in Spark, such as Spark's performance numbers, you
may want to send your custom metrics to SPM, so they can be
visualized/analyzed/"dashboarded" along with your Spark metrics. See
http://sematext.com/spm/integrations/spark-monitoring.html for the Spark
piece and https://sematext.atlassian.net/wiki/display/PUBSPM/Custom+Metrics
for Custom Metrics. If you use Coda Hale's metrics lib, that works too:
there is a pluggable reporter that will send Coda Hale metrics to SPM.

HTH.

Otis
--
Monitoring * Alerting * Anomaly Detection * Centralized Log Management
Solr & Elasticsearch Support * http://sematext.com/


On Mon, Jun 22, 2015 at 9:57 AM, dgoldenberg <dg...@gmail.com>
wrote:

> Hi Gerard,
>
> Have there been any responses? Any insights as to what you ended up doing
> to
> enable custom metrics? I'm thinking of implementing a custom metrics sink,
> not sure how doable that is yet...
>
> Thanks.
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Registering-custom-metrics-tp17765p23426.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>

Re: Registering custom metrics

Posted by dgoldenberg <dg...@gmail.com>.
Hi Gerard,

Have there been any responses? Any insights as to what you ended up doing to
enable custom metrics? I'm thinking of implementing a custom metrics sink,
not sure how doable that is yet...

Thanks.



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Registering-custom-metrics-tp17765p23426.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Registering custom metrics

Posted by Gerard Maas <ge...@gmail.com>.
Very interesting approach. Thanks for sharing it!

On Thu, Jan 8, 2015 at 5:30 PM, Enno Shioji <es...@gmail.com> wrote:

> FYI I found this approach by Ooyala.
>
> /** Instrumentation for Spark based on accumulators.
>   *
>   * Usage:
>   * val instrumentation = new SparkInstrumentation("example.metrics")
>   * val numReqs = sc.accumulator(0L)
>   * instrumentation.source.registerDailyAccumulator(numReqs, "numReqs")
>   * instrumentation.register()
>   *
>   * Will create and report the following metrics:
>   * - Gauge with total number of requests (daily)
>   * - Meter with rate of requests
>   *
>   * @param prefix prefix for all metrics that will be reported by this Instrumentation
>   */
>
> https://gist.github.com/ibuenros/9b94736c2bad2f4b8e23
>
> On Mon, Jan 5, 2015 at 2:56 PM, Enno Shioji <es...@gmail.com> wrote:
>
>> Hi Gerard,
>>
>> Thanks for the answer! I had a good look at it, but I couldn't figure out
>> whether one can use that to emit metrics from your application code.
>>
>> Suppose I wanted to monitor the rate of bytes I produce, like so:
>>
>>     stream
>>         .map { input =>
>>           val bytes = produce(input)
>>           // metricRegistry.meter("some.metrics").mark(bytes.length)
>>           bytes
>>         }
>>         .saveAsTextFile("text")
>>
>> Is there a way to achieve this with the MetricSystem?
>>
>>
>>
>> On Mon, Jan 5, 2015 at 10:24 AM, Gerard Maas <ge...@gmail.com>
>> wrote:
>>
>>> Hi,
>>>
>>> Yes, I managed to register custom metrics by creating an implementation
>>> of org.apache.spark.metrics.source.Source and registering it with the
>>> metrics subsystem.
>>> Source is [Spark] private, so you need to create it under an
>>> org.apache.spark package. In my case, I'm dealing with Spark Streaming
>>> metrics, and I created my CustomStreamingSource under
>>> org.apache.spark.streaming as I also needed access to some [Streaming]
>>> private components.
>>>
>>> Then, you register your new metric Source with Spark's metrics system,
>>> like so:
>>>
>>> SparkEnv.get.metricsSystem.registerSource(customStreamingSource)
>>>
>>> And it will get reported to the metrics Sinks active on your system. By
>>> default, you can access them through the metrics endpoint:
>>> http://<driver-host>:<ui-port>/metrics/json
>>>
>>> I hope this helps.
>>>
>>> -kr, Gerard.
>>>
>>>
>>>
>>>
>>>
>>>
>>> On Tue, Dec 30, 2014 at 3:32 PM, eshioji <es...@gmail.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> Did you find a way to do this, or are you working on it?
>>>> I am trying to find a way to do this as well, but haven't been able
>>>> to find one.
>>>>
>>>>
>>>>
>>>> --
>>>> View this message in context:
>>>> http://apache-spark-developers-list.1001551.n3.nabble.com/Registering-custom-metrics-tp9030p9968.html
>>>> Sent from the Apache Spark Developers List mailing list archive at
>>>> Nabble.com.
>>>>
>>>> ---------------------------------------------------------------------
>>>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>>> For additional commands, e-mail: dev-help@spark.apache.org
>>>>
>>>>
>>>
>>
>

Re: Registering custom metrics

Posted by Enno Shioji <es...@gmail.com>.
FYI I found this approach by Ooyala.

/** Instrumentation for Spark based on accumulators.
  *
  * Usage:
  * val instrumentation = new SparkInstrumentation("example.metrics")
  * val numReqs = sc.accumulator(0L)
  * instrumentation.source.registerDailyAccumulator(numReqs, "numReqs")
  * instrumentation.register()
  *
  * Will create and report the following metrics:
  * - Gauge with total number of requests (daily)
  * - Meter with rate of requests
  *
  * @param prefix prefix for all metrics that will be reported by this Instrumentation
  */

https://gist.github.com/ibuenros/9b94736c2bad2f4b8e23
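
The core of the approach, as a minimal sketch (names are illustrative, not
the gist's exact API): a driver-side Source exposes a Gauge that reads an
accumulator's current value, so work counted on the executors via the
accumulator surfaces in the driver's metrics system.

    package org.apache.spark.metrics.source

    import com.codahale.metrics.{Gauge, MetricRegistry}
    import org.apache.spark.{Accumulator, SparkEnv}

    // Sketch of an accumulator-backed Source (hypothetical names).
    class AccumulatorSource(prefix: String) extends Source {
      override val sourceName = prefix
      override val metricRegistry = new MetricRegistry()

      // Expose the accumulator's driver-side value as a Gauge.
      def registerAccumulator(acc: Accumulator[Long], name: String): Unit =
        metricRegistry.register(MetricRegistry.name(name), new Gauge[Long] {
          override def getValue: Long = acc.value
        })

      def register(): Unit =
        SparkEnv.get.metricsSystem.registerSource(this)
    }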

On Mon, Jan 5, 2015 at 2:56 PM, Enno Shioji <es...@gmail.com> wrote:

> Hi Gerard,
>
> Thanks for the answer! I had a good look at it, but I couldn't figure out
> whether one can use that to emit metrics from your application code.
>
> Suppose I wanted to monitor the rate of bytes I produce, like so:
>
>     stream
>         .map { input =>
>           val bytes = produce(input)
>           // metricRegistry.meter("some.metrics").mark(bytes.length)
>           bytes
>         }
>         .saveAsTextFile("text")
>
> Is there a way to achieve this with the MetricSystem?
>
>
>
> On Mon, Jan 5, 2015 at 10:24 AM, Gerard Maas <ge...@gmail.com>
> wrote:
>
>> Hi,
>>
>> Yes, I managed to register custom metrics by creating an implementation
>> of org.apache.spark.metrics.source.Source and registering it with the
>> metrics subsystem.
>> Source is [Spark] private, so you need to create it under an
>> org.apache.spark package. In my case, I'm dealing with Spark Streaming
>> metrics, and I created my CustomStreamingSource under
>> org.apache.spark.streaming as I also needed access to some [Streaming]
>> private components.
>>
>> Then, you register your new metric Source with Spark's metrics system,
>> like so:
>>
>> SparkEnv.get.metricsSystem.registerSource(customStreamingSource)
>>
>> And it will get reported to the metrics Sinks active on your system. By
>> default, you can access them through the metrics endpoint:
>> http://<driver-host>:<ui-port>/metrics/json
>>
>> I hope this helps.
>>
>> -kr, Gerard.
>>
>>
>>
>>
>>
>>
>> On Tue, Dec 30, 2014 at 3:32 PM, eshioji <es...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> Did you find a way to do this, or are you working on it?
>>> I am trying to find a way to do this as well, but haven't been able
>>> to find one.
>>>
>>>
>>>
>>> --
>>> View this message in context:
>>> http://apache-spark-developers-list.1001551.n3.nabble.com/Registering-custom-metrics-tp9030p9968.html
>>> Sent from the Apache Spark Developers List mailing list archive at
>>> Nabble.com.
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>> For additional commands, e-mail: dev-help@spark.apache.org
>>>
>>>
>>
>

Re: Registering custom metrics

Posted by Enno Shioji <es...@gmail.com>.
Hi Gerard,

Thanks for the answer! I had a good look at it, but I couldn't figure out
whether one can use that to emit metrics from your application code.

Suppose I wanted to monitor the rate of bytes I produce, like so:

    stream
        .map { input =>
          val bytes = produce(input)
          // metricRegistry.meter("some.metrics").mark(bytes.length)
          bytes
        }
        .saveAsTextFile("text")

Is there a way to achieve this with the MetricSystem?



On Mon, Jan 5, 2015 at 10:24 AM, Gerard Maas <ge...@gmail.com> wrote:

> Hi,
>
> Yes, I managed to register custom metrics by creating an implementation
> of org.apache.spark.metrics.source.Source and registering it with the
> metrics subsystem.
> Source is [Spark] private, so you need to create it under an
> org.apache.spark package. In my case, I'm dealing with Spark Streaming
> metrics, and I created my CustomStreamingSource under
> org.apache.spark.streaming as I also needed access to some [Streaming]
> private components.
>
> Then, you register your new metric Source with Spark's metrics system,
> like so:
>
> SparkEnv.get.metricsSystem.registerSource(customStreamingSource)
>
> And it will get reported to the metrics Sinks active on your system. By
> default, you can access them through the metrics endpoint:
> http://<driver-host>:<ui-port>/metrics/json
>
> I hope this helps.
>
> -kr, Gerard.
>
>
>
>
>
>
> On Tue, Dec 30, 2014 at 3:32 PM, eshioji <es...@gmail.com> wrote:
>
>> Hi,
>>
>> Did you find a way to do this, or are you working on it?
>> I am trying to find a way to do this as well, but haven't been able
>> to find one.
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-developers-list.1001551.n3.nabble.com/Registering-custom-metrics-tp9030p9968.html
>> Sent from the Apache Spark Developers List mailing list archive at
>> Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> For additional commands, e-mail: dev-help@spark.apache.org
>>
>>
>

Re: Registering custom metrics

Posted by Gerard Maas <ge...@gmail.com>.
Hi,

Yes, I managed to register custom metrics by creating an implementation
of org.apache.spark.metrics.source.Source and registering it with the
metrics subsystem.
Source is [Spark] private, so you need to create it under an
org.apache.spark package. In my case, I'm dealing with Spark Streaming
metrics, and I created my CustomStreamingSource under
org.apache.spark.streaming as I also needed access to some [Streaming]
private components.

Then, you register your new metric Source with Spark's metrics system,
like so:

SparkEnv.get.metricsSystem.registerSource(customStreamingSource)

And it will get reported to the metrics Sinks active on your system. By
default, you can access them through the metrics endpoint:
http://<driver-host>:<ui-port>/metrics/json
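
For reference, a minimal Source along these lines could look like the
sketch below (the class, metric name, and counter are hypothetical; the
JvmSource and ApplicationSource files linked elsewhere in this thread are
the authoritative examples):

    package org.apache.spark.metrics.source

    import java.util.concurrent.atomic.AtomicLong
    import com.codahale.metrics.{Gauge, MetricRegistry}

    // Hypothetical minimal Source; it lives under org.apache.spark
    // because the Source trait is [Spark] private.
    class CustomSource extends Source {
      override val sourceName = "myapp"
      override val metricRegistry = new MetricRegistry()

      // Application code updates this counter; the Gauge reads it.
      val recordsProcessed = new AtomicLong(0L)

      metricRegistry.register(MetricRegistry.name("records.processed"),
        new Gauge[Long] {
          override def getValue: Long = recordsProcessed.get()
        })
    }

Registered with SparkEnv.get.metricsSystem.registerSource(new CustomSource()),
it then shows up under /metrics/json like the built-in sources.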

I hope this helps.

-kr, Gerard.






On Tue, Dec 30, 2014 at 3:32 PM, eshioji <es...@gmail.com> wrote:

> Hi,
>
> Did you find a way to do this, or are you working on it?
> I am trying to find a way to do this as well, but haven't been able
> to find one.
>
>
>
> --
> View this message in context:
> http://apache-spark-developers-list.1001551.n3.nabble.com/Registering-custom-metrics-tp9030p9968.html
> Sent from the Apache Spark Developers List mailing list archive at
> Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>
>

Re: Registering custom metrics

Posted by eshioji <es...@gmail.com>.
Hi,

Did you find a way to do this, or are you working on it?
I am trying to find a way to do this as well, but haven't been able to find
one.



--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/Registering-custom-metrics-tp9030p9968.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: Registering custom metrics

Posted by Silvio Fiorito <si...@granturing.com>.
Hi Gerard,

Yes, you have to implement your own custom Metrics Source using the Coda Hale library. See here for some examples: https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/metrics/source/JvmSource.scala
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/master/ApplicationSource.scala

The source gets registered; then you have to configure a sink for it, such as the JSON servlet you mentioned.

I had done it in the past but unfortunately don’t have access to the source for that project anymore.

Thanks,
Silvio

From: Gerard Maas
Date: Thursday, October 30, 2014 at 4:53 PM
To: user, "dev@spark.apache.org<ma...@spark.apache.org>"
Subject: Registering custom metrics

Hi,

I've been exploring the metrics exposed by Spark and I'm wondering whether there's a way to register job-specific metrics that could be exposed through the existing metrics system.

Would there be an example somewhere?

BTW, the documentation about how the metrics work could be improved. I found out about the default servlet and the metrics/json/ endpoint in the code, but I could not find any reference to them on the dedicated doc page [1]. That is probably something I could contribute if nobody is working on it at the moment.

-kr, Gerard.

[1]   http://spark.apache.org/docs/1.1.0/monitoring.html#Metrics