Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2017/10/25 12:38:00 UTC

[jira] [Updated] (SPARK-22343) Add support for publishing Spark metrics into Prometheus

     [ https://issues.apache.org/jira/browse/SPARK-22343?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-22343:
------------------------------
    Target Version/s:   (was: 2.2.0)

> Add support for publishing Spark metrics into Prometheus
> --------------------------------------------------------
>
>                 Key: SPARK-22343
>                 URL: https://issues.apache.org/jira/browse/SPARK-22343
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>    Affects Versions: 2.2.0
>            Reporter: Janos Matyas
>
> I've created a PR that adds Prometheus as a metrics sink for Spark - in the https://github.com/apache-spark-on-k8s/spark fork. At the suggestion of that project's maintainers, I'm filing an issue here so the change can be tracked upstream as well. The original text of the PR is below: 
> _
> Publishing Spark metrics into Prometheus - as discussed earlier in #384. Implemented a metrics sink that publishes Spark metrics into Prometheus via the [Prometheus Pushgateway](https://prometheus.io/docs/instrumenting/pushing/). The metrics data published by Spark is based on [Dropwizard](http://metrics.dropwizard.io/). The format of Spark metrics is not supported natively by Prometheus, so they are converted using [DropwizardExports](https://prometheus.io/client_java/io/prometheus/client/dropwizard/DropwizardExports.html) prior to pushing them to the Pushgateway (a sketch of this flow follows below the quoted text).
> Also, the default Prometheus Pushgateway client API implementation does not support metric timestamps, so the client API has been enhanced to enrich the metrics data with timestamps. _
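
For context, a minimal sketch of the flow described above could look like the following, assuming Spark 2.x's private[spark] Sink trait and the Prometheus Java client's simpleclient_dropwizard and simpleclient_pushgateway modules on the classpath. The class name PrometheusPushGatewaySink, the property keys, and the omission of any periodic scheduling or timestamp handling are illustrative assumptions, not the actual code from the PR.

    // Illustrative sketch only; not the PR's implementation.
    package org.apache.spark.metrics.sink   // Spark's Sink trait is private[spark]

    import java.util.Properties

    import com.codahale.metrics.MetricRegistry
    import io.prometheus.client.CollectorRegistry
    import io.prometheus.client.dropwizard.DropwizardExports
    import io.prometheus.client.exporter.PushGateway

    import org.apache.spark.SecurityManager

    private[spark] class PrometheusPushGatewaySink(
        val property: Properties,
        val registry: MetricRegistry,
        securityMgr: SecurityManager) extends Sink {

      // Pushgateway address and job name are read from this sink's entries in
      // metrics.properties (hypothetical property names, with defaults).
      private val address = property.getProperty("pushgateway-address", "localhost:9091")
      private val jobName = property.getProperty("job-name", "spark")

      // Bridge the Dropwizard registry into the Prometheus data model.
      private val collectorRegistry = new CollectorRegistry()
      new DropwizardExports(registry).register(collectorRegistry)

      private val pushGateway = new PushGateway(address)

      override def start(): Unit = {}        // nothing to start for a push-based sink

      override def stop(): Unit = report()   // push one final snapshot on shutdown

      // Convert the current metrics snapshot and push it to the Pushgateway.
      override def report(): Unit = pushGateway.pushAdd(collectorRegistry, jobName)
    }

Such a sink would then be wired up through metrics.properties, for example (class and property names again hypothetical):

    *.sink.prometheus.class=org.apache.spark.metrics.sink.PrometheusPushGatewaySink
    *.sink.prometheus.pushgateway-address=pushgateway.example.com:9091
    *.sink.prometheus.job-name=spark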



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org