Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/04/02 21:36:09 UTC

[GitHub] [spark] rxin commented on a change in pull request #24271: [SPARK-27342][SQL] Optimize Limit 0 queries

rxin commented on a change in pull request #24271: [SPARK-27342][SQL] Optimize Limit 0 queries
URL: https://github.com/apache/spark/pull/24271#discussion_r271505321
 
 

 ##########
 File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/PropagateEmptyRelation.scala
 ##########
 @@ -108,5 +108,27 @@ object PropagateEmptyRelation extends Rule[LogicalPlan] with PredicateHelper wit
       case Generate(_: Explode, _, _, _, _, _) => empty(p)
       case _ => p
     }
+
+    // Nodes below a GlobalLimit or LocalLimit can be pruned if the limit value is zero.
+    // Any subtree of the logical plan that has GlobalLimit 0 or LocalLimit 0 as its root is
+    // semantically equivalent to an empty relation.
+    //
+    // In such cases, the effect of Limit 0 can be propagated through the logical plan by
+    // replacing the (Global/Local) Limit subtree with an empty LocalRelation. This prunes the
+    // subtree below and triggers the other PropagateEmptyRelation rules, which propagate the
+    // change up the logical plan.
+    //
+    // Replace GlobalLimit 0 nodes with an empty LocalRelation.
+    case p @ GlobalLimit(IntegerLiteral(limit), _) if limit == 0 =>
 
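For illustration, the end-to-end effect this hunk aims for can be seen on a simple query. The optimized-plan text below is approximate and version-dependent, and `spark` is assumed to be an active SparkSession:

    // A Limit 0 query should collapse to an empty LocalRelation, after which
    // PropagateEmptyRelation can prune any operators stacked above it.
    val df = spark.range(100).limit(0)
    println(df.queryExecution.optimizedPlan)
    // Prints something like: LocalRelation <empty>, [id#0L]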
 Review comment:
   It's better to do this in a new rule, to make it more modular.
   
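  For illustration, a standalone rule along those lines might look like the sketch below; the object name OptimizeLimitZero and the empty() helper are assumptions made for the sketch, not settled code from this PR:

    import org.apache.spark.sql.catalyst.expressions.IntegerLiteral
    import org.apache.spark.sql.catalyst.plans.logical.{GlobalLimit, LocalLimit, LocalRelation, LogicalPlan}
    import org.apache.spark.sql.catalyst.rules.Rule

    // A separate rule keeps the Limit 0 handling modular, as suggested above.
    object OptimizeLimitZero extends Rule[LogicalPlan] {
      // Keep the original output attributes so the plan's schema is unchanged.
      private def empty(plan: LogicalPlan): LocalRelation =
        LocalRelation(plan.output, data = Seq.empty, isStreaming = plan.isStreaming)

      def apply(plan: LogicalPlan): LogicalPlan = plan transformUp {
        // GlobalLimit 0 and LocalLimit 0 both return no rows, so either
        // subtree can be replaced by an empty LocalRelation.
        case gl @ GlobalLimit(IntegerLiteral(0), _) => empty(gl)
        case ll @ LocalLimit(IntegerLiteral(0), _) => empty(ll)
      }
    }

  Such a rule would still need to be registered in the optimizer's rule batches (before or alongside PropagateEmptyRelation) so that the resulting empty relation gets propagated up the plan.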

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org