Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/09/26 03:38:57 UTC

[GitHub] [spark] HyukjinKwon commented on a change in pull request #34097: [SPARK-36792][SQL][FOLLOWUP] Refactor InSet generated code

HyukjinKwon commented on a change in pull request #34097:
URL: https://github.com/apache/spark/pull/34097#discussion_r716133921



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/predicates.scala
##########
@@ -612,26 +612,28 @@ case class InSet(child: Expression, hset: Set[Any]) extends UnaryExpression with
         ""
       }
 
-      val ret = child.dataType match {
+      val isNaNCode = child.dataType match {
         case DoubleType => Some((v: Any) => s"java.lang.Double.isNaN($v)")
         case FloatType => Some((v: Any) => s"java.lang.Float.isNaN($v)")
         case _ => None
       }
 
-      ret.map { isNaN =>
-        s"""
-          |if ($setTerm.contains($c)) {
-          |  ${ev.value} = true;
-          |} else if (${isNaN(c)}) {
-          |  ${ev.value} =  $hasNaN;
-          |}
-          |$setIsNull
-          |""".stripMargin
-      }.getOrElse(
-        s"""
-           |${ev.value} = $setTerm.contains($c);
-           |$setIsNull
-         """.stripMargin)
+      hasNaN match {

Review comment:
       Can we just use if-else here? Also, let's file a separate JIRA. This is technically a performance improvement: since `hasNaN` is known at code generation time, we can avoid dispatching on NaN at runtime based on the values in the set.
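   The if-else shape the reviewer suggests might look roughly like the sketch below. This is a standalone illustration, not the actual Spark patch: the object and parameter names (`InSetCodegenSketch`, `genInSetCode`, `setTerm`, `evValue`, `floatingPoint`) are hypothetical stand-ins for the codegen locals in `InSet.doGenCode`. Because `hasNaN` is a compile-time (codegen-time) constant, branching on it in Scala lets the generated Java skip the NaN check entirely when the set contains no NaN:

   ```scala
   // Hedged sketch only; names are placeholders, not Spark's real API.
   object InSetCodegenSketch {
     def genInSetCode(
         hasNaN: Boolean,        // whether the literal set contains NaN (known at codegen time)
         floatingPoint: Boolean, // stands in for the DoubleType/FloatType match
         setTerm: String = "set",
         c: String = "value",
         evValue: String = "result"): String = {
       // Mirrors the isNaNCode Option from the diff.
       val isNaNCode: Option[String => String] =
         if (floatingPoint) Some(v => s"java.lang.Double.isNaN($v)") else None
       isNaNCode match {
         case Some(isNaN) if hasNaN =>
           // NaN is in the set: a NaN input always matches, so fold the
           // NaN test into a single condition in the generated code.
           s"""
              |if ($setTerm.contains($c) || ${isNaN(c)}) {
              |  $evValue = true;
              |}
              |""".stripMargin
         case _ =>
           // No NaN in the set (or a non-floating-point type): contains()
           // alone reproduces the original semantics, since the old
           // "else if (isNaN) value = hasNaN" branch would assign false.
           s"$evValue = $setTerm.contains($c);\n"
       }
     }
   }
   ```

   The point of the refactor is that the runtime NaN branch is emitted only when it can change the result, rather than being decided per-row in the generated code.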
   




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


