Posted to issues@spark.apache.org by "Eren Avsarogullari (Jira)" <ji...@apache.org> on 2020/06/29 23:31:00 UTC

[jira] [Commented] (SPARK-32026) Add PrometheusServlet Unit Test coverage

    [ https://issues.apache.org/jira/browse/SPARK-32026?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17148211#comment-17148211 ] 

Eren Avsarogullari commented on SPARK-32026:
--------------------------------------------

Currently, _PrometheusServlet_ follows the Spark 3.0 JMX Sink + Prometheus JMX Exporter format, so for this Jira's coverage only _PrometheusServletSuite_ is added, in light of the discussion on https://github.com/apache/spark/pull/28865.
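
For reference, a minimal sketch of the kind of check such a suite could make, assuming a hypothetical toPrometheusLine helper rather than the actual PrometheusServlet API: register a Dropwizard gauge and assert that its exposition line comes out in the current JMX-Exporter-like driver format (see the Driver example quoted below).

{code:scala}
import com.codahale.metrics.{Gauge, MetricRegistry}

// Hypothetical helper mirroring the current driver format: the application id
// stays embedded in the metric name, and '.', '-' are normalized to '_'.
def toPrometheusLine(name: String, value: Long): String = {
  val normalized = name.replaceAll("[./-]", "_")
  s"""metrics_${normalized}_Value{type="gauges"} $value"""
}

val registry = new MetricRegistry
registry.register("local-1592242896665.driver.BlockManager.memory.memUsed_MB",
  new Gauge[Long] { override def getValue: Long = 0L })

registry.getGauges.forEach { (name, gauge) =>
  val line = toPrometheusLine(name, gauge.getValue.asInstanceOf[Long])
  assert(line ==
    """metrics_local_1592242896665_driver_BlockManager_memory_memUsed_MB_Value{type="gauges"} 0""")
}
{code}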
 

> Add PrometheusServlet Unit Test coverage
> ----------------------------------------
>
>                 Key: SPARK-32026
>                 URL: https://issues.apache.org/jira/browse/SPARK-32026
>             Project: Spark
>          Issue Type: Task
>          Components: Spark Core
>    Affects Versions: 3.0.1
>            Reporter: Eren Avsarogullari
>            Priority: Major
>
> Spark 3.0 introduces a native Prometheus sink for both driver and executor metrics. However, the two formats need to be consistent (e.g., in how `applicationId` is exposed). Currently, the driver embeds `applicationId` in the metric name. If it were extracted into a label, as in the executor metric format, the two formats would be consistent and easier to query (a sketch of the proposed relabeling follows the quoted examples below).
> *Driver*
> {code:java}
> metrics_local_1592242896665_driver_BlockManager_memory_memUsed_MB_Value{type="gauges"} 0{code}
> *Executor*
> {code:java}
> metrics_executor_memoryUsed_bytes{application_id="local-1592242896665", application_name="apache-spark-fundamentals", executor_id="driver"} 24356{code}
> *Proposed Driver Format*
> {code:java}
> metrics_driver_BlockManager_memory_memUsed_MB_Value{application_id="local-1592242896665", type="gauges"} 0{code}
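
For illustration only, a sketch of the relabeling the proposal describes: moving the application id out of the driver metric name and into an application_id label so that driver metrics line up with the executor format. The regex and the toProposedFormat helper are assumptions for this example, not the actual Spark implementation.

{code:scala}
// Hypothetical rewrite from the current driver format to the proposed one.
// The pattern assumes names of the form metrics_local_<timestamp>_driver_<metric>_Value,
// matching the examples quoted above.
val CurrentDriver =
  """metrics_local_(\d+)_driver_(\w+)_Value\{(.*)\} (\S+)""".r

def toProposedFormat(line: String): String = line match {
  case CurrentDriver(appTs, metric, labels, value) =>
    // Drop the application id from the metric name and attach it as a label,
    // mirroring the executor metric format.
    s"""metrics_driver_${metric}_Value{application_id="local-$appTs", $labels} $value"""
  case _ => line // already labeled, or not a driver metric
}

println(toProposedFormat(
  """metrics_local_1592242896665_driver_BlockManager_memory_memUsed_MB_Value{type="gauges"} 0"""))
// metrics_driver_BlockManager_memory_memUsed_MB_Value{application_id="local-1592242896665", type="gauges"} 0
{code}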



