Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/04/01 07:09:27 UTC

[GitHub] [spark] cloud-fan commented on a change in pull request #24253: [SPARK-19712][SQL][FOLLOW-UP] Don't do partial pushdown when pushing down LeftAnti joins below Aggregate or Window operators.

URL: https://github.com/apache/spark/pull/24253#discussion_r270736877
 
 

 ##########
 File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/PushDownLeftSemiAntiJoin.scala
 ##########
 @@ -70,10 +70,12 @@ object PushDownLeftSemiAntiJoin extends Rule[LogicalPlan] with PredicateHelper {
         }
 
         // Check if the remaining predicates do not contain columns from the right
-        // hand side of the join. Since the remaining predicates will be kept
-        // as a filter over aggregate, this check is necessary after the left semi
-        // or left anti join is moved below aggregate. The reason is, for this kind
-        // of join, we only output from the left leg of the join.
+        // hand side of the join. In case of LeftSemi join, since remaining predicates
 
 Review comment:
  shall we move the part of the comment about the different join types into
   ```
   joinType match {
     case LeftSemi => Filter(stayUp.reduce(And), newAgg)
     case _ => join
   }
   ```
   
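For context, the reviewer's suggestion amounts to keeping each join-type-specific explanation next to the match arm that handles it, rather than in one block comment above. A hedged sketch of how that might read (the match itself is quoted from the review; the inline comments are an illustration, paraphrasing the comment text from the diff and the PR title, not code from the actual patch):
```
joinType match {
  // LeftSemi: the remaining predicates are kept as a Filter over the
  // pushed-down aggregate, so they must not reference columns from the
  // right-hand side -- after the join is moved below the aggregate, only
  // the left leg's columns are output.
  case LeftSemi => Filter(stayUp.reduce(And), newAgg)
  // LeftAnti (and other types): keep the original join; per this
  // follow-up, partial pushdown is not done for LeftAnti joins below
  // Aggregate or Window operators.
  case _ => join
}
```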

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org