Posted to reviews@spark.apache.org by "cloud-fan (via GitHub)" <gi...@apache.org> on 2023/11/01 16:03:23 UTC

[PR] [SPARK-45760][SQL] Add With expression to avoid duplicating expressions [spark]

cloud-fan opened a new pull request, #43623:
URL: https://github.com/apache/spark/pull/43623

   
   ### What changes were proposed in this pull request?
   Sometimes we need to duplicate expressions when rewriting the plan. This is OK for small queries, as codegen has common-subexpression elimination (CSE) to avoid evaluating the same expression twice. However, when the query is big, duplicating expressions can produce a very large expression tree, making catalyst rules very slow or even causing OOM when updating a leaf node (which requires copying all tree nodes).
   
   This PR introduces a new expression for expression-level CTEs: it adds a Project to pre-evaluate the common expressions, so that they appear only once in the query plan tree and are evaluated only once. `NullIf` now uses this new expression to avoid duplicating its `left` child expression.
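   To make the size blow-up concrete, here is a minimal standalone sketch (toy classes for illustration, not Spark's actual `Expression` API): duplicating a subexpression copies its whole subtree into the plan, while a `With`-style reference node stays constant-size no matter how big the common expression is.

   ```scala
   object TreeSizeSketch {
     // Toy expression tree with a size metric.
     sealed trait Expr { def size: Int }
     final case class Leaf(name: String) extends Expr { def size = 1 }
     final case class Add(l: Expr, r: Expr) extends Expr { def size = 1 + l.size + r.size }
     // Stand-in for a reference to a pre-evaluated common expression.
     final case class CommonRef(id: Long) extends Expr { def size = 1 }

     val common = Add(Leaf("a"), Leaf("b"))           // the shared subexpression, size 3
     val duplicated = Add(common, common)             // a rewrite that copies it: size 7
     val withRefs   = Add(CommonRef(0), CommonRef(0)) // With-style references: size 3
   }
   ```

   With real-world expressions the duplicated subtree can itself contain duplicated subtrees, so the growth compounds; the reference node keeps the tree linear in the number of distinct expressions.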
   
   ### Why are the changes needed?
   This makes catalyst rules faster on large query plans and avoids OOM from oversized expression trees.
   
   ### Does this PR introduce _any_ user-facing change?
   No
   
   ### How was this patch tested?
   A new test suite was added.
   
   ### Was this patch authored or co-authored using generative AI tooling?
   No


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


Re: [PR] [SPARK-45760][SQL] Add With expression to avoid duplicating expressions [spark]

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on PR #43623:
URL: https://github.com/apache/spark/pull/43623#issuecomment-1789224288

   cc @viirya @wangyum 




Re: [PR] [SPARK-45760][SQL] Add With expression to avoid duplicating expressions [spark]

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on PR #43623:
URL: https://github.com/apache/spark/pull/43623#issuecomment-1790377475

   BTW, I feel it's useful to have a way to do explicit/manual CSE, instead of relying on optimizer features or codegen features.




Re: [PR] [SPARK-45760][SQL] Add With expression to avoid duplicating expressions [spark]

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on PR #43623:
URL: https://github.com/apache/spark/pull/43623#issuecomment-1790214519

   @viirya yes, we can add SQL syntax in the future, following https://github.com/google/zetasql/blob/a745bef47b315bb11fecab4eeefa2bcc41be5951/docs/operators.md?plain=1#L2865
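   For reference, the linked ZetaSQL `WITH` expression has the following shape (a syntax sketch only; Spark does not parse this yet, and the eventual Spark syntax would be decided separately):

   ```sql
   -- WITH(name AS expr, ..., result_expr): each binding is evaluated once and
   -- can be referenced by later bindings and by the result expression.
   SELECT WITH(a AS 2 + 3,   -- evaluated once
               b AS a * a,   -- may reference earlier bindings
               a + b)        -- result expression: 5 + 25 = 30
   ```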




Re: [PR] [SPARK-45760][SQL] Add With expression to avoid duplicating expressions [spark]

Posted by "viirya (via GitHub)" <gi...@apache.org>.
viirya commented on code in PR #43623:
URL: https://github.com/apache/spark/pull/43623#discussion_r1379249466


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/RewriteWithExpression.scala:
##########
@@ -0,0 +1,99 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.optimizer
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeMap, CommonExpressionRef, Expression, With}
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, Project}
+import org.apache.spark.sql.catalyst.rules.Rule
+import org.apache.spark.sql.catalyst.trees.TreePattern.{COMMON_EXPR_REF, WITH_EXPRESSION}
+
+/**
+ * Rewrites the `With` expressions by adding a `Project` to pre-evaluate the common expressions, or
+ * just inline them if they are cheap.
+ *
+ * Note: For now we only use `With` in a few `RuntimeReplaceable` expressions. If we expand its
+ *       usage, we should support aggregate/window functions as well.
+ */
+object RewriteWithExpression extends Rule[LogicalPlan] {
+  override def apply(plan: LogicalPlan): LogicalPlan = {
+    plan.transformWithPruning(_.containsPattern(WITH_EXPRESSION)) {
+      case p if p.expressions.exists(_.containsPattern(WITH_EXPRESSION)) =>
+        val commonExprs = mutable.ArrayBuffer.empty[Alias]
+        // `With` can be nested. We should only rewrite the leaf `With` expressions, as an outer
+        // `With` needs to add its own Project in a later iteration, once it becomes a leaf.
+        // This is done via "transform down", checking that the common expression definitions do
+        // not contain a nested `With`.
+        var newPlan: LogicalPlan = p.transformExpressionsDown {
+          case With(child, defs) if defs.forall(!_.containsPattern(WITH_EXPRESSION)) =>
+            val idToCheapExpr = mutable.HashMap.empty[Long, Expression]
+            val idToNonCheapExpr = mutable.HashMap.empty[Long, Alias]
+            defs.foreach { commonExprDef =>
+              if (CollapseProject.isCheap(commonExprDef.child)) {
+                idToCheapExpr(commonExprDef.id) = commonExprDef.child
+              } else {
+                // TODO: we should calculate the ref count and also inline the common expression
+                //       if its ref count is 1.
+                val alias = Alias(commonExprDef.child, s"_common_expr_${commonExprDef.id}")()
+                commonExprs += alias
+                idToNonCheapExpr(commonExprDef.id) = alias
+              }
+            }
+
+            child.transformWithPruning(_.containsPattern(COMMON_EXPR_REF)) {
+              case ref: CommonExpressionRef =>
+                idToCheapExpr.getOrElse(ref.id, idToNonCheapExpr(ref.id).toAttribute)
+            }
+        }
+
+        var exprsToAdd = commonExprs.toSeq
+        val newChildren = p.children.map { child =>
+          val (newExprs, others) = exprsToAdd.partition(_.references.subsetOf(child.outputSet))
+          exprsToAdd = others
+          if (newExprs.nonEmpty) {
+            Project(child.output ++ newExprs, child)
+          } else {
+            child
+          }
+        }
+
+        if (exprsToAdd.nonEmpty) {
+          // If we cannot rewrite the common expressions, force to inline them so that the query

Review Comment:
   When would we be unable to rewrite them?





Re: [PR] [SPARK-45760][SQL] Add With expression to avoid duplicating expressions [spark]

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on PR #43623:
URL: https://github.com/apache/spark/pull/43623#issuecomment-1790371025

   > Why not add an optimizer rule to find the common expressions and insert a Project to pre-evaluate?
   
   We need this `With` expression anyway to support the WITH syntax from Google BigQuery, and it's safer to adopt it incrementally to avoid expression duplication. It's also easier to build incrementally, as agg/window function support can be added later.




Re: [PR] [SPARK-45760][SQL] Add With expression to avoid duplicating expressions [spark]

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on PR #43623:
URL: https://github.com/apache/spark/pull/43623#issuecomment-1790362905

   Why not add an optimizer rule to find the common expressions and insert a Project to pre-evaluate?
   It seems this PR requires us to update each expression to use the `With` expression, one by one.




Re: [PR] [SPARK-45760][SQL] Add With expression to avoid duplicating expressions [spark]

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on PR #43623:
URL: https://github.com/apache/spark/pull/43623#issuecomment-1799203571

   The doc generation issue is unrelated to my PR
   ```
   ImportError: Warning: Latest version of pandas (2.1.2) is required to generate the documentation; however, your version was 2.1.1
   ```
   
   I think we need to upgrade the pandas version on the GA machines. cc @HyukjinKwon @LuciferYang 




Re: [PR] [SPARK-45760][SQL] Add With expression to avoid duplicating expressions [spark]

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #43623:
URL: https://github.com/apache/spark/pull/43623#discussion_r1379693227


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/RewriteWithExpression.scala:
##########
@@ -0,0 +1,99 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.optimizer
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeMap, CommonExpressionRef, Expression, With}
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, Project}
+import org.apache.spark.sql.catalyst.rules.Rule
+import org.apache.spark.sql.catalyst.trees.TreePattern.{COMMON_EXPR_REF, WITH_EXPRESSION}
+
+/**
+ * Rewrites the `With` expressions by adding a `Project` to pre-evaluate the common expressions, or
+ * just inline them if they are cheap.
+ *
+ * Note: For now we only use `With` in a few `RuntimeReplaceable` expressions. If we expand its
+ *       usage, we should support aggregate/window functions as well.
+ */
+object RewriteWithExpression extends Rule[LogicalPlan] {
+  override def apply(plan: LogicalPlan): LogicalPlan = {
+    plan.transformWithPruning(_.containsPattern(WITH_EXPRESSION)) {
+      case p if p.expressions.exists(_.containsPattern(WITH_EXPRESSION)) =>
+        val commonExprs = mutable.ArrayBuffer.empty[Alias]
+        // `With` can be nested. We should only rewrite the leaf `With` expressions, as an outer
+        // `With` needs to add its own Project in a later iteration, once it becomes a leaf.
+        // This is done via "transform down", checking that the common expression definitions do
+        // not contain a nested `With`.
+        var newPlan: LogicalPlan = p.transformExpressionsDown {
+          case With(child, defs) if defs.forall(!_.containsPattern(WITH_EXPRESSION)) =>
+            val idToCheapExpr = mutable.HashMap.empty[Long, Expression]
+            val idToNonCheapExpr = mutable.HashMap.empty[Long, Alias]
+            defs.foreach { commonExprDef =>
+              if (CollapseProject.isCheap(commonExprDef.child)) {
+                idToCheapExpr(commonExprDef.id) = commonExprDef.child
+              } else {
+                // TODO: we should calculate the ref count and also inline the common expression
+                //       if its ref count is 1.
+                val alias = Alias(commonExprDef.child, s"_common_expr_${commonExprDef.id}")()
+                commonExprs += alias
+                idToNonCheapExpr(commonExprDef.id) = alias
+              }
+            }
+
+            child.transformWithPruning(_.containsPattern(COMMON_EXPR_REF)) {
+              case ref: CommonExpressionRef =>
+                idToCheapExpr.getOrElse(ref.id, idToNonCheapExpr(ref.id).toAttribute)
+            }
+        }
+
+        var exprsToAdd = commonExprs.toSeq
+        val newChildren = p.children.map { child =>

Review Comment:
   It's the same: the new plan was produced by `transformExpressionsDown`, so the children won't change.





Re: [PR] [SPARK-45760][SQL] Add With expression to avoid duplicating expressions [spark]

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on PR #43623:
URL: https://github.com/apache/spark/pull/43623#issuecomment-1799204354

   I'm merging it to master, thanks for the reviews!




Re: [PR] [SPARK-45760][SQL] Add With expression to avoid duplicating expressions [spark]

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan closed pull request #43623: [SPARK-45760][SQL] Add With expression to avoid duplicating expressions
URL: https://github.com/apache/spark/pull/43623




Re: [PR] [SPARK-45760][SQL] Add With expression to avoid duplicating expressions [spark]

Posted by "wangyum (via GitHub)" <gi...@apache.org>.
wangyum commented on code in PR #43623:
URL: https://github.com/apache/spark/pull/43623#discussion_r1380292748


##########
connector/connect/server/src/test/scala/org/apache/spark/sql/connect/ProtoToParsedPlanTestSuite.scala:
##########
@@ -181,8 +183,15 @@ class ProtoToParsedPlanTestSuite
       val planner = new SparkConnectPlanner(SessionHolder.forTesting(spark))
       val catalystPlan =
         analyzer.executeAndCheck(planner.transformRelation(relation), new QueryPlanningTracker)
-      val actual =
-        removeMemoryAddress(normalizeExprIds(ReplaceExpressions(catalystPlan)).treeString)
+      val finalAnalyzedPlan = {
+        object Helper extends RuleExecutor[LogicalPlan] {
+          val batches =
+            Batch("Finish Analysis", Once, ReplaceExpressions) ::
+            Batch("Rewrite With expression", FixedPoint(10), RewriteWithExpression) :: Nil

Review Comment:
   Please fix the Scala style.





Re: [PR] [SPARK-45760][SQL] Add With expression to avoid duplicating expressions [spark]

Posted by "peter-toth (via GitHub)" <gi...@apache.org>.
peter-toth commented on PR #43623:
URL: https://github.com/apache/spark/pull/43623#issuecomment-1793716973

   Looks good to me, but I think we can make the new rule idempotent with a small refactor: https://github.com/cloud-fan/spark/pull/19




Re: [PR] [SPARK-45760][SQL] Add With expression to avoid duplicating expressions [spark]

Posted by "Kimahriman (via GitHub)" <gi...@apache.org>.
Kimahriman commented on PR #43623:
URL: https://github.com/apache/spark/pull/43623#issuecomment-1793739993

   What's the longer-term goal with this, especially in the context of all the attempts to add CSE to conditional expressions? Is the idea to make a `with` Column expression that can be used manually? Honestly, when I started with Spark I assumed this was the default behavior if you re-used a Column object multiple times in an expression. 
   
   Just out of curiosity: since this is basically manual CSE at the optimizer stage, would it make sense to do the existing fully recursive, automatic CSE at the optimizer stage instead of the physical stage, to achieve a similar effect?




Re: [PR] [SPARK-45760][SQL] Add With expression to avoid duplicating expressions [spark]

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on PR #43623:
URL: https://github.com/apache/spark/pull/43623#issuecomment-1794079004

   > What's the longer term goal with this, especially in the context of all the attempts to add CSE to conditional expressions? 
   
   I think the final state should be implementing CSE at the logical plan level, so that it works for both the codegen backend and native (vectorized) backends. But we still have gaps now. The `With` expression only works for operators that can add extra Projects; it doesn't work for join conditions, while codegen is more flexible. Codegen CSE is also more adaptive and can be used for conditional expressions, whereas `With` is static. Until we fill these gaps, `With` can't replace the current codegen CSE.
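   The static-vs-adaptive point can be sketched outside Spark (a toy illustration, not Spark code): pre-evaluating a common expression in a Project computes it unconditionally, while codegen-style CSE can defer the computation until a conditional branch actually needs it.

   ```scala
   object CseEvalSketch {
     var evals = 0
     def expensive(x: Int): Int = { evals += 1; x * x }

     // Static pre-evaluation (what a Project in front of the operator does):
     // the common expression runs even if the branch using it is skipped.
     def staticCount: Int = {
       evals = 0
       val cond = false
       val pre = expensive(3)
       val result = if (cond) pre else 0
       evals // 1: computed even though the branch was not taken
     }

     // Deferred evaluation (closer to what adaptive codegen CSE can do for
     // conditional expressions): only computed when a branch demands it.
     def adaptiveCount: Int = {
       evals = 0
       val cond = false
       lazy val onDemand = expensive(3)
       val result = if (cond) onDemand else 0
       evals // 0: never computed
     }
   }
   ```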




Re: [PR] [SPARK-45760][SQL] Add With expression to avoid duplicating expressions [spark]

Posted by "viirya (via GitHub)" <gi...@apache.org>.
viirya commented on code in PR #43623:
URL: https://github.com/apache/spark/pull/43623#discussion_r1379248105


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/RewriteWithExpression.scala:
##########
@@ -0,0 +1,99 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.optimizer
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeMap, CommonExpressionRef, Expression, With}
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, Project}
+import org.apache.spark.sql.catalyst.rules.Rule
+import org.apache.spark.sql.catalyst.trees.TreePattern.{COMMON_EXPR_REF, WITH_EXPRESSION}
+
+/**
+ * Rewrites the `With` expressions by adding a `Project` to pre-evaluate the common expressions, or
+ * just inline them if they are cheap.
+ *
+ * Note: For now we only use `With` in a few `RuntimeReplaceable` expressions. If we expand its
+ *       usage, we should support aggregate/window functions as well.
+ */
+object RewriteWithExpression extends Rule[LogicalPlan] {
+  override def apply(plan: LogicalPlan): LogicalPlan = {
+    plan.transformWithPruning(_.containsPattern(WITH_EXPRESSION)) {
+      case p if p.expressions.exists(_.containsPattern(WITH_EXPRESSION)) =>
+        val commonExprs = mutable.ArrayBuffer.empty[Alias]
+        // `With` can be nested. We should only rewrite the leaf `With` expressions, as an outer
+        // `With` needs to add its own Project in a later iteration, once it becomes a leaf.
+        // This is done via "transform down", checking that the common expression definitions do
+        // not contain a nested `With`.
+        var newPlan: LogicalPlan = p.transformExpressionsDown {
+          case With(child, defs) if defs.forall(!_.containsPattern(WITH_EXPRESSION)) =>
+            val idToCheapExpr = mutable.HashMap.empty[Long, Expression]
+            val idToNonCheapExpr = mutable.HashMap.empty[Long, Alias]
+            defs.foreach { commonExprDef =>
+              if (CollapseProject.isCheap(commonExprDef.child)) {
+                idToCheapExpr(commonExprDef.id) = commonExprDef.child
+              } else {
+                // TODO: we should calculate the ref count and also inline the common expression
+                //       if its ref count is 1.
+                val alias = Alias(commonExprDef.child, s"_common_expr_${commonExprDef.id}")()
+                commonExprs += alias
+                idToNonCheapExpr(commonExprDef.id) = alias
+              }
+            }
+
+            child.transformWithPruning(_.containsPattern(COMMON_EXPR_REF)) {
+              case ref: CommonExpressionRef =>
+                idToCheapExpr.getOrElse(ref.id, idToNonCheapExpr(ref.id).toAttribute)
+            }
+        }
+
+        var exprsToAdd = commonExprs.toSeq
+        val newChildren = p.children.map { child =>

Review Comment:
   Shouldn't it be:
   
   ```suggestion
           val newChildren = newPlan.children.map { child =>
   ```





Re: [PR] [SPARK-45760][SQL] Add With expression to avoid duplicating expressions [spark]

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #43623:
URL: https://github.com/apache/spark/pull/43623#issuecomment-1799214806

   > The doc generation issue is unrelated to my PR
   > 
   > ```
   > ImportError: Warning: Latest version of pandas (2.1.2) is required to generate the documentation; however, your version was 2.1.1
   > ```
   > 
   > I think we need to upgrade pandas version on GA machines. cc @HyukjinKwon @LuciferYang
   
   Already upgraded: https://github.com/apache/spark/pull/43689 :)

