Posted to reviews@spark.apache.org by "kazuyukitanimura (via GitHub)" <gi...@apache.org> on 2023/11/16 23:50:53 UTC

[PR] [SPARK-45786][SQL][FOLLOWUP][TEST] Fix Decimal random number tests with ANSI enabled [spark]

kazuyukitanimura opened a new pull request, #43853:
URL: https://github.com/apache/spark/pull/43853

   ### What changes were proposed in this pull request?
   This follow-up PR fixes a test for SPARK-45786 that was failing in GHA with SPARK_ANSI_SQL_MODE=true.
   
   
   ### Why are the changes needed?
   The issue was discovered in https://github.com/apache/spark/pull/43678#discussion_r1395693417
   
   
   ### Does this PR introduce _any_ user-facing change?
   No
   
   
   ### How was this patch tested?
   Test updated
   
   
   ### Was this patch authored or co-authored using generative AI tooling?
   No


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


Re: [PR] [SPARK-45786][SQL][FOLLOWUP][TEST] Fix Decimal random number tests with ANSI enabled [spark]

Posted by "kazuyukitanimura (via GitHub)" <gi...@apache.org>.
kazuyukitanimura commented on code in PR #43853:
URL: https://github.com/apache/spark/pull/43853#discussion_r1396794780


##########
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/ArithmeticExpressionSuite.scala:
##########
@@ -308,25 +308,38 @@ class ArithmeticExpressionSuite extends SparkFunSuite with ExpressionEvalHelper
           val mulResult = Decimal(mulExact.setScale(mulType.scale, RoundingMode.HALF_UP))
           val mulExpected =
             if (mulResult.precision > DecimalType.MAX_PRECISION) null else mulResult
-          checkEvaluation(mulActual, mulExpected)
+          tryCheckEvaluation(mulActual, mulExpected)
 
           val divType = Divide(null, null).resultDecimalType(p1, s1, p2, s2)
           val divResult = Decimal(divExact.setScale(divType.scale, RoundingMode.HALF_UP))
           val divExpected =
             if (divResult.precision > DecimalType.MAX_PRECISION) null else divResult
-          checkEvaluation(divActual, divExpected)
+          tryCheckEvaluation(divActual, divExpected)
 
           val remType = Remainder(null, null).resultDecimalType(p1, s1, p2, s2)
           val remResult = Decimal(remExact.setScale(remType.scale, RoundingMode.HALF_UP))
           val remExpected =
             if (remResult.precision > DecimalType.MAX_PRECISION) null else remResult
-          checkEvaluation(remActual, remExpected)
+          tryCheckEvaluation(remActual, remExpected)
 
           val quotType = IntegralDivide(null, null).resultDecimalType(p1, s1, p2, s2)
           val quotResult = Decimal(quotExact.setScale(quotType.scale, RoundingMode.HALF_UP))
           val quotExpected =
             if (quotResult.precision > DecimalType.MAX_PRECISION) null else quotResult
-          checkEvaluation(quotActual, quotExpected.toLong)
+          tryCheckEvaluation(quotActual, quotExpected.toLong)
+        }
+      }
+
+      def tryCheckEvaluation(actual: BinaryArithmetic, expected: Any): Unit = {
+        try {
+          checkEvaluation(actual, expected)
+        }
+        catch {
+          // Ignore NUMERIC_VALUE_OUT_OF_RANGE when ANSI is enabled
+          case e: org.scalatest.exceptions.TestFailedException
+            if e.cause.exists(c => c.isInstanceOf[SparkArithmeticException] &&
+              c.asInstanceOf[SparkArithmeticException].getErrorClass
+                == "NUMERIC_VALUE_OUT_OF_RANGE") && SQLConf.get.ansiEnabled =>

Review Comment:
   @dongjoon-hyun
   I realized there is a better way to write this test and updated it.
   >  Does it mean the previous ANSI mode should throw this exception in this test case?
   
   Yes, that has always been the case.
   
   >  I'm wondering why the corrected values are out of range after
   
   The corrected values are null in non-ANSI mode; ANSI mode throws the exception instead.
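   The guard used in the diff can be sketched in isolation. This is a hedged, Spark-free illustration: `StandInArithmeticException`, `shouldIgnore`, and `GuardSketch` are hypothetical stand-ins invented here, since `SparkArithmeticException` and scalatest's `TestFailedException` are not available outside the Spark test tree.

```scala
// Hypothetical stand-in for SparkArithmeticException and its getErrorClass.
final class StandInArithmeticException(val errorClass: String)
  extends ArithmeticException(errorClass)

// Swallow a failure only when ANSI mode is on AND the failure's cause
// carries the expected error class; anything else should still fail the test.
def shouldIgnore(failure: Throwable, ansiEnabled: Boolean): Boolean =
  ansiEnabled && (Option(failure.getCause) match {
    case Some(c: StandInArithmeticException) =>
      c.errorClass == "NUMERIC_VALUE_OUT_OF_RANGE"
    case _ => false
  })

object GuardSketch extends App {
  val cause = new StandInArithmeticException("NUMERIC_VALUE_OUT_OF_RANGE")
  val failure = new RuntimeException("checkEvaluation failed", cause)
  println(shouldIgnore(failure, ansiEnabled = true))  // true
  println(shouldIgnore(failure, ansiEnabled = false)) // false
}
```

   Checking both the ANSI flag and the cause's error class keeps the catch narrow: an unrelated test failure, or the same error class with ANSI disabled, still surfaces as a failure.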





Re: [PR] [SPARK-45786][SQL][FOLLOWUP][TEST] Fix Decimal random number tests with ANSI enabled [spark]

Posted by "dongjoon-hyun (via GitHub)" <gi...@apache.org>.
dongjoon-hyun commented on PR #43853:
URL: https://github.com/apache/spark/pull/43853#issuecomment-1816147713

   Merged to master/3.5/3.4. Thank you, @kazuyukitanimura and @LuciferYang .




Re: [PR] [SPARK-45786][SQL][FOLLOWUP][TEST] Fix Decimal random number tests with ANSI enabled [spark]

Posted by "kazuyukitanimura (via GitHub)" <gi...@apache.org>.
kazuyukitanimura commented on PR #43853:
URL: https://github.com/apache/spark/pull/43853#issuecomment-1816361398

   Thank you all




Re: [PR] [SPARK-45786][SQL][FOLLOWUP][TEST] Fix Decimal random number tests with ANSI enabled [spark]

Posted by "kazuyukitanimura (via GitHub)" <gi...@apache.org>.
kazuyukitanimura commented on code in PR #43853:
URL: https://github.com/apache/spark/pull/43853#discussion_r1396777531


##########
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/ArithmeticExpressionSuite.scala:
##########
(quotes the same hunk shown in the first review comment above)

Review Comment:
   Thank you @dongjoon-hyun @LuciferYang
   No behavior change was introduced by #43678.
   This test assumed the default spark.sql.ansi.enabled=false. The default behavior does not throw an exception on overflow, but ANSI mode does. Since this test uses random values, some combinations may overflow.
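   As a Spark-free illustration of why random operands can exceed the decimal limit (a hedged sketch: the operand width is chosen for illustration; only the 38-digit maximum mirrors Spark's DecimalType.MAX_PRECISION):

```scala
import java.math.{BigDecimal => JBigDecimal}

object DecimalOverflowSketch extends App {
  val MaxPrecision = 38 // mirrors Spark's DecimalType.MAX_PRECISION

  // Two 20-digit operands: their exact product needs up to 40 digits,
  // more than any Spark DecimalType can represent.
  val a = new JBigDecimal("9" * 20)
  val product = a.multiply(a)

  println(product.precision)                // 40
  println(product.precision > MaxPrecision) // true: the result is null in
                                            // non-ANSI mode, and ANSI mode
                                            // raises NUMERIC_VALUE_OUT_OF_RANGE
}
```

   When the randomly drawn precisions land in this regime, the test's expected value is null under the default config, while ANSI mode throws instead, which is exactly the case tryCheckEvaluation tolerates.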





Re: [PR] [SPARK-45786][SQL][FOLLOWUP][TEST] Fix Decimal random number tests with ANSI enabled [spark]

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on code in PR #43853:
URL: https://github.com/apache/spark/pull/43853#discussion_r1396685343


##########
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/ArithmeticExpressionSuite.scala:
##########
(quotes the same hunk shown in the first review comment above)

Review Comment:
   +1, for this question





Re: [PR] [SPARK-45786][SQL][FOLLOWUP][TEST] Fix Decimal random number tests with ANSI enabled [spark]

Posted by "dongjoon-hyun (via GitHub)" <gi...@apache.org>.
dongjoon-hyun commented on PR #43853:
URL: https://github.com/apache/spark/pull/43853#issuecomment-1815858614

   Thank you for updating, @kazuyukitanimura . Pending CIs.




Re: [PR] [SPARK-45786][SQL][FOLLOWUP][TEST] Fix Decimal random number tests with ANSI enabled [spark]

Posted by "dongjoon-hyun (via GitHub)" <gi...@apache.org>.
dongjoon-hyun closed pull request #43853: [SPARK-45786][SQL][FOLLOWUP][TEST] Fix Decimal random number tests with ANSI enabled
URL: https://github.com/apache/spark/pull/43853




Re: [PR] [SPARK-45786][SQL][FOLLOWUP][TEST] Fix Decimal random number tests with ANSI enabled [spark]

Posted by "dongjoon-hyun (via GitHub)" <gi...@apache.org>.
dongjoon-hyun commented on code in PR #43853:
URL: https://github.com/apache/spark/pull/43853#discussion_r1396582499


##########
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/ArithmeticExpressionSuite.scala:
##########
(quotes the same hunk shown in the first review comment above)

Review Comment:
   Oh, does this mean a behavior change in ANSI mode, @kazuyukitanimura ?





Re: [PR] [SPARK-45786][SQL][FOLLOWUP][TEST] Fix Decimal random number tests with ANSI enabled [spark]

Posted by "dongjoon-hyun (via GitHub)" <gi...@apache.org>.
dongjoon-hyun commented on code in PR #43853:
URL: https://github.com/apache/spark/pull/43853#discussion_r1396787960


##########
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/ArithmeticExpressionSuite.scala:
##########
(quotes the same hunk shown in the first review comment above)

Review Comment:
   To @kazuyukitanimura ,
   - Does it mean the previous ANSI mode should throw this exception in this test case?
   - In other words, I'm wondering why the corrected values are out of range after #43678.
   


