Posted to reviews@spark.apache.org by "ulysses-you (via GitHub)" <gi...@apache.org> on 2023/05/05 01:56:50 UTC

[GitHub] [spark] ulysses-you commented on a diff in pull request #41046: [SPARK-43376][SQL] Improve reuse subquery with table cache

ulysses-you commented on code in PR #41046:
URL: https://github.com/apache/spark/pull/41046#discussion_r1185645621


##########
sql/core/src/main/scala/org/apache/spark/sql/execution/adaptive/ReuseAdaptiveSubquery.scala:
##########
@@ -33,10 +33,13 @@ case class ReuseAdaptiveSubquery(
 
     plan.transformAllExpressionsWithPruning(_.containsPattern(PLAN_EXPRESSION)) {
       case sub: ExecSubqueryExpression =>
-        val newPlan = reuseMap.getOrElseUpdate(sub.plan.canonicalized, sub.plan)
-        if (newPlan.ne(sub.plan)) {
-          sub.withNewPlan(ReusedSubqueryExec(newPlan))
-        } else {
+        // `InsertAdaptiveSparkPlan` compiles the subquery once per exprId, so the Java
+        // objects are always `eq` if two subqueries have the same exprId.
+        // Check whether the subquery can be reused manually instead of calling
+        // `getOrElseUpdate`.
+        reuseMap.get(sub.plan.canonicalized).map { subquery =>
+          sub.withNewPlan(ReusedSubqueryExec(subquery))
+        }.getOrElse {
+          reuseMap.put(sub.plan.canonicalized, sub.plan)

Review Comment:
   The behavior should be the same as `getOrElseUpdate`. I changed `put` to `putIfAbsent` to be fully equivalent to `getOrElseUpdate`: if there is a race condition, the first `put` wins.
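   
   A minimal standalone sketch of that concurrency point (not the Spark code; `reuseOrRegister` and the `String` stand-ins for `SparkPlan` are hypothetical): `putIfAbsent` returns the previously stored value if one exists, so the first insert wins under a race, exactly like `getOrElseUpdate`, whereas a bare `put` would let a second thread silently overwrite the plan that earlier lookups already reuse.
   
       import scala.collection.concurrent.TrieMap
   
       object ReuseSketch {
         // key: canonicalized plan, value: the plan to reuse (Strings as stand-ins)
         val reuseMap = TrieMap[String, String]()
   
         // Race-safe equivalent of reuseMap.getOrElseUpdate(key, plan):
         // Left(existing) means an earlier registration won and should be reused,
         // Right(plan) means this call registered the plan first.
         def reuseOrRegister(key: String, plan: String): Either[String, String] =
           reuseMap.putIfAbsent(key, plan) match {
             case Some(existing) => Left(existing)
             case None           => Right(plan)
           }
       }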



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

