Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/08/29 15:59:35 UTC

[GitHub] [spark] cloud-fan commented on a diff in pull request #37706: [SPARK-39915][SQL] Dataset.repartition(N) may not create N partitions (Non-AQE part)

cloud-fan commented on code in PR #37706:
URL: https://github.com/apache/spark/pull/37706#discussion_r957520604


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/PropagateEmptyRelation.scala:
##########
@@ -162,13 +171,49 @@ abstract class PropagateEmptyRelationBase extends Rule[LogicalPlan] with CastSupport
       case _ => p
     }
   }
+
+  protected def userSpecifiedRepartition(p: LogicalPlan): Boolean = p match {
+    case _: Repartition => true
+    case r: RepartitionByExpression
+      if r.optNumPartitions.isDefined || r.partitionExpressions.nonEmpty => true
+    case _ => false
+  }
+
+  protected val repartitionTreePattern: TreePattern = REPARTITION_OPERATION
+  protected def applyInternal(plan: LogicalPlan): LogicalPlan
+
+  /**
+   * Add a [[ROOT_REPARTITION]] tag for the root user-specified repartition so this rule can
+   * skip optimizing it.
+   */
+  protected def addTagForRootRepartition(plan: LogicalPlan): LogicalPlan = {
+    var isRootRepartition = true
+    plan.transformDownWithPruning(_.containsPattern(repartitionTreePattern)) {

Review Comment:
   I think it's better to use a manual traversal here. We can stop the traversal once we hit a node that is not a repartition, project, or filter.
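
   For illustration, a minimal sketch of such a manual traversal. It assumes ROOT_REPARTITION is a TreeNodeTag[Unit] (the tag's actual definition in the PR may differ) and reuses the userSpecifiedRepartition helper from the diff above; this is a sketch of the suggested approach, not the code that was merged:

       import org.apache.spark.sql.catalyst.plans.logical.{Filter, LogicalPlan, Project}
       import org.apache.spark.sql.catalyst.trees.TreeNodeTag
       import org.apache.spark.sql.catalyst.trees.TreePattern.REPARTITION_OPERATION

       // Assumed tag marking the root user-specified repartition.
       val ROOT_REPARTITION = TreeNodeTag[Unit]("root_repartition")

       def addTagForRootRepartition(plan: LogicalPlan): LogicalPlan = {
         // Prune: if no repartition exists in this subtree, there is nothing to tag.
         if (!plan.containsPattern(REPARTITION_OPERATION)) {
           plan
         } else {
           plan match {
             // Project and Filter do not affect partitioning: keep descending.
             case p: Project => p.mapChildren(addTagForRootRepartition)
             case f: Filter  => f.mapChildren(addTagForRootRepartition)
             // Tag the outermost user-specified repartition, then stop descending
             // so nested repartitions below it remain optimizable.
             case r if userSpecifiedRepartition(r) =>
               r.setTagValue(ROOT_REPARTITION, ())
               r
             // Any other node ends the traversal: no root repartition past this point.
             case _ => plan
           }
         }
       }

   Compared to transformDownWithPruning, this recursion stops at the first node that is not a project, filter, or repartition, so it only walks the chain at the top of the plan rather than visiting every subtree that contains the pattern.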



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

