Posted to reviews@spark.apache.org by "jchen5 (via GitHub)" <gi...@apache.org> on 2023/04/20 22:48:32 UTC

[GitHub] [spark] jchen5 commented on a diff in pull request #40865: [SPARK-43156][SQL] Fix `COUNT(*) is null` bug in correlated scalar subquery

jchen5 commented on code in PR #40865:
URL: https://github.com/apache/spark/pull/40865#discussion_r1173128023


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/subquery.scala:
##########
@@ -599,10 +600,32 @@ object RewriteCorrelatedScalarSubquery extends Rule[LogicalPlan] with AliasHelpe
         if (Utils.isTesting) {
           assert(mayHaveCountBug.isDefined)
         }
+
+        def queryOutputFoldable(list: Seq[NamedExpression]): Boolean = {
+          trimAliases(list.filter(p => p.exprId.equals(query.output.head.exprId)).head).foldable
+        }
+
+        lazy val resultFoldable = {
+          query match {
+            case Project(expressions, _) =>
+              queryOutputFoldable(expressions)
+            case Aggregate(_, expressions, _) =>
+              queryOutputFoldable(expressions)
+            case _ =>
+              false
+          }
+        }
+
         if (resultWithZeroTups.isEmpty) {
           // CASE 1: Subquery guaranteed not to have the COUNT bug because it evaluates to NULL
           // with zero tuples.
           planWithoutCountBug
+        } else if (resultFoldable) {

Review Comment:
   I just changed this code in https://github.com/apache/spark/pull/40811, so you'll need to merge with that. Your check should go after my new one, because if the subquery is something like `select false from ... group by c` it will still actually return null on empty input: the `GROUP BY` produces zero groups, and therefore zero output rows, even though the projected expression is foldable. I added a test case for this in my PR.
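
   The failure mode above can be sketched outside Spark. This is a hypothetical helper, not Spark's API: it models `SELECT false FROM rows GROUP BY c` as an `Option`, where `None` stands for SQL NULL. The projected literal `false` is foldable, yet an empty input produces zero groups, hence zero output rows, hence NULL:

   ```scala
   // Hypothetical sketch (not Spark code): a foldable projection does not
   // guarantee a non-null scalar subquery result on empty input.
   object FoldableNullSketch {
     // Models "SELECT false FROM rows GROUP BY c" as a scalar subquery result:
     // Some(false) when at least one group exists, None (SQL NULL) otherwise.
     // For simplicity we assume the correlation condition selects one group.
     def scalarSubquery(rows: Seq[Int]): Option[Boolean] = {
       val groups = rows.groupBy(identity) // GROUP BY c
       if (groups.isEmpty) None            // zero groups -> zero rows -> NULL
       else Some(false)                    // foldable projection over the group
     }

     def main(args: Array[String]): Unit = {
       assert(scalarSubquery(Seq(1, 1, 2)).contains(false)) // non-empty input
       assert(scalarSubquery(Seq.empty).isEmpty)            // empty input: NULL
     }
   }
   ```

   This is why the `resultFoldable` shortcut has to be ordered after the zero-tuple check: foldability of the projected expression says nothing about whether a row is produced at all.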



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
For additional commands, e-mail: reviews-help@spark.apache.org