Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/09/07 17:17:19 UTC

[GitHub] [spark] ekoifman commented on a change in pull request #33641: [SPARK-36416][SQL] Add SQL metrics to AdaptiveSparkPlanExec for BHJs and Skew joins

ekoifman commented on a change in pull request #33641:
URL: https://github.com/apache/spark/pull/33641#discussion_r703694177



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/adaptive/AdaptiveSparkPlanExec.scala
##########
@@ -79,6 +81,13 @@ case class AdaptiveSparkPlanExec(
     case _ => logDebug(_)
   }
 
+  override lazy val metrics = Map(
+    "num broadcast join conversions" ->

Review comment:
       Hello,
   @cloud-fan, @maryannxue @hvanhovell @viirya
   Do any of you have any suggestions on how to proceed?
   It would also be useful to add a "reOptimize duration" metric to track total `reOptimize()` time and a "generate explainString duration" metric to track `context.qe.explainString`, which can be noticeable for large plans.
   These are not exactly the same as other "metrics", but `AdaptiveSparkPlanExec` is also quite different from other nodes.
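   As a rough illustration only, a minimal sketch of how such counter and timing metrics could be declared on `AdaptiveSparkPlanExec`, assuming the existing `SQLMetrics.createMetric` / `SQLMetrics.createTimingMetric` helpers; the metric keys and display strings below are hypothetical, not necessarily the ones used in this PR:

```scala
// Sketch only: hypothetical metric declarations using the standard SQLMetrics helpers.
import org.apache.spark.sql.execution.metric.{SQLMetric, SQLMetrics}

override lazy val metrics: Map[String, SQLMetric] = Map(
  // how many joins were converted to broadcast hash joins at runtime
  "num broadcast join conversions" ->
    SQLMetrics.createMetric(sparkContext, "number of broadcast join conversions"),
  // total time spent in reOptimize() across all re-planning rounds
  "reOptimize duration" ->
    SQLMetrics.createTimingMetric(sparkContext, "time spent re-optimizing the plan"),
  // time spent regenerating the explain string, which can be large for big plans
  "generate explainString duration" ->
    SQLMetrics.createTimingMetric(sparkContext, "time spent generating explain string"))
```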




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

