Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/10/15 07:09:33 UTC

[GitHub] [spark] maropu commented on a change in pull request #30018: [SPARK-33122][SQL] Remove redundant aggregates in the Optimizer

maropu commented on a change in pull request #30018:
URL: https://github.com/apache/spark/pull/30018#discussion_r505242243



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/Optimizer.scala
##########
@@ -477,6 +478,25 @@ object RemoveRedundantAliases extends Rule[LogicalPlan] {
   def apply(plan: LogicalPlan): LogicalPlan = removeRedundantAliases(plan, AttributeSet.empty)
 }
 
+/**
+ * Remove redundant aggregates from a query plan. A redundant aggregate is an aggregate whose
+ * only goal is to keep distinct values, while its parent aggregate would ignore duplicate values.
+ */
+object RemoveRedundantAggregates extends Rule[LogicalPlan] {
+  def apply(plan: LogicalPlan): LogicalPlan = plan transformUp {
+    case upper @ Aggregate(_, _, lower: Aggregate) if lowerIsRedundant(upper, lower) =>
+      upper.copy(child = lower.child)
+  }
+
+  private def lowerIsRedundant(upper: Aggregate, lower: Aggregate): Boolean = {
+    val upperReferencesOnlyGrouping = upper.references
+      .subsetOf(AttributeSet(lower.groupingExpressions))

Review comment:
       I think `CleanupAliases` removes aliases in `groupingExpressions` in the analyzer phase.
   
   ```
       val upperReferencesOnlyGrouping = upper.references
         .subsetOf(AttributeSet(lower.groupingExpressions))
   ```
    How about checking whether the upper aggregate's references are a subset of the non-aggregate expressions in the lower node instead?
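To make the suggestion concrete, here is a hedged sketch using a hypothetical mini-model (the `Attribute`, `Expr`, and `Aggregate` classes below are simplified stand-ins, not Spark's actual Catalyst classes). It compares the upper aggregate's references against the *non-aggregate* output expressions of the lower node, rather than against `lower.groupingExpressions`, whose aliases `CleanupAliases` may already have stripped in the analyzer. Note the real rule would also need to verify that the upper aggregate functions are duplicate-agnostic; that check is omitted here.

```scala
// Hypothetical mini-model of Catalyst expressions (illustration only).
case class Attribute(name: String)

sealed trait Expr { def references: Set[Attribute] }
case class AttrRef(attr: Attribute) extends Expr {
  def references: Set[Attribute] = Set(attr)
}
case class AggFunc(name: String, child: Expr) extends Expr {
  def references: Set[Attribute] = child.references
}
case class Alias(child: Expr, out: Attribute) extends Expr {
  def references: Set[Attribute] = child.references
}

// aggregateExpressions plays the role of the node's output list.
case class Aggregate(groupingExpressions: Seq[Expr],
                     aggregateExpressions: Seq[Expr])

def containsAggFunc(e: Expr): Boolean = e match {
  case AggFunc(_, _) => true
  case Alias(c, _)   => containsAggFunc(c)
  case _             => false
}

def outputAttribute(e: Expr): Option[Attribute] = e match {
  case AttrRef(a)  => Some(a)
  case Alias(_, a) => Some(a)
  case _           => None
}

def lowerIsRedundant(upper: Aggregate, lower: Aggregate): Boolean = {
  // Attributes the lower aggregate emits without an aggregate function.
  val nonAggOutputs: Set[Attribute] = lower.aggregateExpressions
    .filterNot(containsAggFunc)
    .flatMap(outputAttribute)
    .toSet
  // Everything the upper aggregate reads.
  val upperRefs: Set[Attribute] =
    (upper.groupingExpressions ++ upper.aggregateExpressions)
      .flatMap(_.references)
      .toSet
  upperRefs.subsetOf(nonAggOutputs)
}
```

For example, a lower `GROUP BY a, b` that only emits `a, b` (pure de-duplication) feeding an upper `SELECT a, max(b) GROUP BY a` passes the check, while an upper node that references a `sum(b)` computed by the lower node does not.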




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org