Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/10/17 00:29:29 UTC

[GitHub] [spark] huaxingao commented on a change in pull request #34291: [SPARK-XXX][SQL][WIP] DS V2 LIMIT push down

huaxingao commented on a change in pull request #34291:
URL: https://github.com/apache/spark/pull/34291#discussion_r729982797



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/V2ScanRelationPushDown.scala
##########
@@ -225,6 +226,31 @@ object V2ScanRelationPushDown extends Rule[LogicalPlan] with PredicateHelper {
       withProjection
   }
 
+  def applyLimit(plan: LogicalPlan): LogicalPlan = plan.transform {
+    case globalLimit @ GlobalLimit(_, LocalLimit(limitExpr, child)) => child match {

Review comment:
       Technically, it would be fine to match only `LocalLimit`. 
   I looked at the code, and it seems to me that the only case where we have a `LocalLimit` without a `GlobalLimit` is when the optimizer pushes `LocalLimit` beneath `UNION ALL` and joins in this rule: https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/Optimizer.scala#L645. I think we won't hit that case here, because `UNION ALL` and joins are not pushed down to data sources. A sketch of the two matching shapes is below.
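   To make the two options concrete, here is a minimal illustrative sketch (not the PR's actual code; `pushDownScanLimit` is a hypothetical helper standing in for this rule's push-down logic):

```scala
import org.apache.spark.sql.catalyst.expressions.Expression
import org.apache.spark.sql.catalyst.plans.logical.{GlobalLimit, LocalLimit, LogicalPlan}

// Hypothetical stand-in for the actual push-down logic in this PR:
// here it just returns the plan unchanged.
def pushDownScanLimit(
    limitNode: LogicalPlan,
    limitExpr: Expression,
    child: LogicalPlan): LogicalPlan = limitNode

def applyLimit(plan: LogicalPlan): LogicalPlan = plan.transform {
  // Shape used in the PR: rewrite only when the GlobalLimit/LocalLimit
  // pair is present, i.e. the limit applies to the whole query result.
  case g @ GlobalLimit(_, LocalLimit(limitExpr, child)) =>
    pushDownScanLimit(g, limitExpr, child)

  // Alternative shape: also match a bare LocalLimit. As noted above, a
  // LocalLimit without a GlobalLimit parent should only come from the
  // optimizer pushing LocalLimit beneath UNION ALL or joins, which never
  // sits directly on top of a data source scan, so in practice this case
  // would not fire for DS v2 push-down.
  case l @ LocalLimit(limitExpr, child) =>
    pushDownScanLimit(l, limitExpr, child)
}
```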




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org