Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/03/22 22:55:18 UTC

[GitHub] [spark] c21 commented on a change in pull request #31931: [SPARK-34707][SQL] Code-gen broadcast nested loop join (left outer/right outer)

c21 commented on a change in pull request #31931:
URL: https://github.com/apache/spark/pull/31931#discussion_r599128682



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/joins/BroadcastNestedLoopJoinExec.scala
##########
@@ -458,6 +460,41 @@ case class BroadcastNestedLoopJoinExec(
      """.stripMargin
   }
 
+  private def codegenOuter(ctx: CodegenContext, input: Seq[ExprCode]): String = {
+    val (_, buildRowArrayTerm) = prepareBroadcast(ctx)
+    val (buildRow, checkCondition, _) = getJoinCondition(ctx, input, streamed, broadcast)
+    val buildVars = genBuildSideVars(ctx, buildRow, broadcast)
+
+    val resultVars = buildSide match {
+      case BuildLeft => buildVars ++ input
+      case BuildRight => input ++ buildVars
+    }
+    val arrayIndex = ctx.freshName("arrayIndex")
+    val shouldOutputRow = ctx.freshName("shouldOutputRow")
+    val foundMatch = ctx.freshName("foundMatch")
+    val numOutput = metricTerm(ctx, "numOutputRows")
+
+    s"""
+       |boolean $foundMatch = false;
+       |for (int $arrayIndex = 0; $arrayIndex < $buildRowArrayTerm.length; $arrayIndex++) {

Review comment:
       What if the broadcast side is empty? This does not seem right, because for an outer join we still need to output one row per streamed-side row, null-padded on the build side.
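
       The row-level semantics behind this comment can be sketched as follows. This is an illustrative model only, not the actual generated code from the PR; `Row`, `condition`, and `buildWidth` are hypothetical stand-ins for Spark's internal row representation, join condition, and build-side schema width:

       ```scala
       object OuterJoinSketch {
         type Row = Seq[Any]

         // Left outer join against a broadcast (in-memory) build side.
         // Key point: a streamed row is always emitted, even when the
         // build array is empty or no build row satisfies the condition.
         def leftOuterJoin(
             streamed: Seq[Row],
             buildRows: Array[Row],
             condition: (Row, Row) => Boolean,
             buildWidth: Int): Seq[Row] = {
           streamed.flatMap { streamedRow =>
             val matched = buildRows.filter(condition(streamedRow, _))
             if (matched.isEmpty) {
               // Empty build side (or no match): null-pad the build columns.
               Seq(streamedRow ++ Seq.fill(buildWidth)(null))
             } else {
               matched.toSeq.map(streamedRow ++ _)
             }
           }
         }
       }
       ```

       A loop that only emits rows inside `for (... $arrayIndex < $buildRowArrayTerm.length ...)` would never reach the null-padded branch when the array length is zero, which is the bug being flagged.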




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org