Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/03/31 17:36:57 UTC

[GitHub] [spark] wypoon commented on a change in pull request #31451: [SPARK-34338][SQL] Report metrics from Datasource v2 scan

wypoon commented on a change in pull request #31451:
URL: https://github.com/apache/spark/pull/31451#discussion_r605093466



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/metric/SQLMetricInfo.scala
##########
@@ -27,4 +27,5 @@ import org.apache.spark.annotation.DeveloperApi
 class SQLMetricInfo(
     val name: String,
     val accumulatorId: Long,
-    val metricType: String)
+    val metricType: String,
+    val aggregateMethod: (Array[Long], Array[Long]) => String)

Review comment:
       Yes, precisely: `SQLMetricInfo` objects are created from `SQLMetric` objects, and `SQLPlanMetric` objects are created from `SQLMetricInfo` objects. Most of the time an `SQLMetric` just needs the default aggregateMethod, so it is natural for `SQLMetricInfo` and `SQLPlanMetric` to share that same default.
   A default is just that. In the non-test parts this PR needed to touch, we have to set the aggregateMethod explicitly from the `SQLMetric`, because the code there must work for all `SQLMetric`s. But in other cases, such as the test parts now, or future uses we don't know about yet, the default may suffice.
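   To illustrate the point about defaults, here is a minimal sketch of the pattern under discussion. The class and object names (`MetricInfoSketch`, `MetricDefaults`) are hypothetical stand-ins, not Spark's actual classes; the real default aggregation lives elsewhere in Spark's metric code. The sketch only shows how a `(Array[Long], Array[Long]) => String` parameter with a default lets most call sites omit it while the PR's touched call sites pass one explicitly:

   ```scala
   // Hypothetical sketch: a default aggregate method shared by metric-info classes.
   object MetricDefaults {
     // Default aggregator: sum the collected values and render them as a string.
     // The second array is accepted only to match the
     // (Array[Long], Array[Long]) => String shape from the diff above.
     val defaultAggregateMethod: (Array[Long], Array[Long]) => String =
       (values, _) => values.sum.toString
   }

   // Analogue of SQLMetricInfo with the new parameter defaulted, so existing
   // callers (e.g. tests) compile unchanged while new call sites can override it.
   class MetricInfoSketch(
       val name: String,
       val accumulatorId: Long,
       val metricType: String,
       val aggregateMethod: (Array[Long], Array[Long]) => String =
         MetricDefaults.defaultAggregateMethod)

   object MetricInfoSketchDemo extends App {
     // Caller relying on the default:
     val info = new MetricInfoSketch("rows read", 1L, "sum")
     println(info.aggregateMethod(Array(1L, 2L, 3L), Array.empty)) // prints "6"

     // Caller overriding it explicitly, as the non-test code in the PR must:
     val maxInfo = new MetricInfoSketch("peak memory", 2L, "size",
       (values, _) => values.max.toString)
     println(maxInfo.aggregateMethod(Array(4L, 9L, 2L), Array.empty)) // prints "9"
   }
   ```

   The design choice this mirrors: defaulting the new constructor parameter keeps the change source-compatible for the many call sites that don't care about aggregation, while still forcing the handful of generic code paths to thread the method through explicitly.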




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org