Posted to reviews@spark.apache.org by tdas <gi...@git.apache.org> on 2018/08/01 05:50:46 UTC

[GitHub] spark pull request #21622: [SPARK-24637][SS] Add metrics regarding state and...

Github user tdas commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21622#discussion_r206761192
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/MetricsReporter.scala ---
    @@ -39,6 +42,23 @@ class MetricsReporter(
       registerGauge("processingRate-total", _.processedRowsPerSecond, 0.0)
       registerGauge("latency", _.durationMs.get("triggerExecution").longValue(), 0L)
     
    +  private val timestampFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'") // ISO8601
    +  timestampFormat.setTimeZone(DateTimeUtils.getTimeZone("UTC"))
    +
    +  registerGauge("eventTime-watermark",
    +    progress => convertStringDateToMillis(progress.eventTime.get("watermark")), 0L)
    +
    +  registerGauge("states-rowsTotal", _.stateOperators.map(_.numRowsTotal).sum, 0L)
    +  registerGauge("states-usedBytes", _.stateOperators.map(_.memoryUsedBytes).sum, 0L)
    +
    --- End diff --
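
    For context, the convertStringDateToMillis helper referenced in the hunk is defined elsewhere in the patch and is not shown here. A minimal sketch of what such a helper might look like, assuming it simply parses the ISO8601 watermark string from StreamingQueryProgress.eventTime using the timestampFormat declared above:
    
        // Hypothetical sketch; the actual helper body is not visible in this hunk.
        // Parse an ISO8601 UTC timestamp (e.g. "2018-08-01T05:50:46.000Z") into
        // epoch milliseconds for the gauge; fall back to 0L when the value is absent.
        private def convertStringDateToMillis(isoUtcDateStr: String): Long = {
          if (isoUtcDateStr == null) 0L else timestampFormat.parse(isoUtcDateStr).getTime
        }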
    
    Those are custom metrics, which may or may not be present depending on the implementation of the state store. I don't recommend adding them here directly.
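
    For context on why this matters: provider-specific state store metrics are exposed per operator through a customMetrics map on StateOperatorProgress (in recent Spark versions), and a given key exists only when the underlying state store implementation reports it. A hedged sketch of what registering one of them would have to look like, guarding against the key being absent (the gauge name and metric key below are illustrative examples, not lines from this PR):
    
        // Illustrative sketch, not part of the PR: aggregate a provider-specific
        // metric defensively, since the key is only present for providers that
        // report it (e.g. the HDFS-backed state store); default to 0L otherwise.
        registerGauge("states-loadedMapCacheHitCount",
          progress => progress.stateOperators.map { op =>
            Option(op.customMetrics.get("loadedMapCacheHitCount"))
              .map(_.longValue()).getOrElse(0L)
          }.sum,
          0L)
    
    Keeping such provider-dependent gauges out of MetricsReporter avoids publishing Codahale gauges that would always read zero for other state store implementations, which appears to be the concern raised above.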


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org