Posted to user@spark.apache.org by Han JU <ju...@gmail.com> on 2016/09/09 12:20:04 UTC
Get spark metrics in code
Hi,
I'd like to know if there's a possibility to get spark's metrics from code.
For example
val sc = new SparkContext(conf)
val result = myJob(sc, ...)
result.save(...)
val gauge = MetricSystem.getGauge("org.apache.spark....")
println(gauge.getValue) // or send to internal aggregation service
I'm aware that there's a configuration for sending metrics to several kinds
of sinks, but I'm more interested in a per-job style, since we use a
custom log/metric aggregation service for building dashboards.
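A hedged sketch of one direction (not Spark's actual API: `MetricSystem.getGauge` above is pseudocode, and the registry and gauge names here are purely illustrative). Spark's metrics are built on the Dropwizard (Coda Hale) metrics-core library, which Spark ships with, so one option is to keep job-level values in a Dropwizard registry you control and read them back after the action completes:

```scala
import com.codahale.metrics.{Gauge, MetricRegistry, SharedMetricRegistries}

// Illustrative sketch: keep a registry under a well-known name so that
// driver code can look gauges up after the job finishes. The registry
// name and metric name are made up for this example.
val registry: MetricRegistry = SharedMetricRegistries.getOrCreate("my-app")

registry.register("myJob.recordsWritten", new Gauge[Long] {
  override def getValue: Long = 42L // stand-in for a value your job tracks
})

// After result.save(...), read the gauge back and forward it:
val gauge = registry.getGauges.get("myJob.recordsWritten")
println(gauge.getValue) // prints 42; or push to the aggregation service
```

This only covers metrics you register yourself; getting at Spark's own internal gauges per job is a separate question.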
Thanks.
--
*JU Han*
Software Engineer @ Teads.tv
+33 0619608888
Re: Get spark metrics in code
Posted by Steve Loughran <st...@hortonworks.com>.
> On 9 Sep 2016, at 13:20, Han JU <ju...@gmail.com> wrote:
>
> Hi,
>
> I'd like to know if there's a possibility to get spark's metrics from code. For example
>
> val sc = new SparkContext(conf)
> val result = myJob(sc, ...)
> result.save(...)
>
> val gauge = MetricSystem.getGauge("org.apache.spark....")
> println(gauge.getValue) // or send to internal aggregation service
>
> I'm aware that there's a configuration for sending metrics to several kinds of sinks, but I'm more interested in a per-job style, since we use a custom log/metric aggregation service for building dashboards.
>
It's all Coda Hale (Dropwizard) metrics; it should be retrievable somehow, for a loose definition of "somehow".
I'd be interested in knowing what you come up with here.
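For per-job numbers specifically, one supported route is a `SparkListener`, which receives task-level metrics as tasks finish. A hedged sketch of the aggregation side (the class name is made up; wiring it to Spark is described in the comment rather than compiled in, so the snippet stays self-contained):

```scala
import java.util.concurrent.atomic.LongAdder

// Illustrative per-job aggregator. In a real job you would subclass
// org.apache.spark.scheduler.SparkListener, call add() from onTaskEnd with
// something like taskEnd.taskMetrics.outputMetrics.recordsWritten, register
// it via sc.addSparkListener(...), and read sum after the action returns.
class JobMetricAggregator {
  private val recordsWritten = new LongAdder
  def add(n: Long): Unit = recordsWritten.add(n)
  def sum: Long = recordsWritten.sum
}

val agg = new JobMetricAggregator
Seq(10L, 32L).foreach(agg.add) // stand-in for per-task callbacks
println(agg.sum)               // prints 42; push this to the dashboard service
```

The advantage over a configured sink is that the value is available in driver code right after the action, which fits the per-job, custom-aggregation use case described above.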
---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org