Posted to reviews@spark.apache.org by "MaxGekk (via GitHub)" <gi...@apache.org> on 2023/04/04 11:56:35 UTC

[GitHub] [spark] MaxGekk commented on a diff in pull request #40609: [SPARK-42316][SQL] Assign name to _LEGACY_ERROR_TEMP_2044

MaxGekk commented on code in PR #40609:
URL: https://github.com/apache/spark/pull/40609#discussion_r1157127026


##########
core/src/main/resources/error/error-classes.json:
##########
@@ -35,6 +35,12 @@
     ],
     "sqlState" : "22003"
   },
+  "BINARY_ARITHMETIC_CAUSE_OVERFLOW" : {

Review Comment:
   Let's shorten it to `BINARY_ARITHMETIC_OVERFLOW`, in line with the other error class names in this file.
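   For illustration, the renamed entry would keep the same shape as its neighbors in `error-classes.json`. This is only a sketch: the `message` text and placeholders below are hypothetical, not taken from the PR; only the name and the `22003` SQLSTATE (shown for the adjacent entry in the diff) come from the thread.
   ```json
   "BINARY_ARITHMETIC_OVERFLOW" : {
     "message" : [
       "<value1> <symbol> <value2> caused overflow."
     ],
     "sqlState" : "22003"
   }
   ```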



##########
sql/core/src/test/scala/org/apache/spark/sql/errors/QueryExecutionErrorsSuite.scala:
##########
@@ -625,6 +625,21 @@ class QueryExecutionErrorsSuite
     }
   }
 
+  test("BINARY_ARITHMETIC_CAUSE_OVERFLOW: byte plus byte result overflow") {
+    withSQLConf("spark.sql.ansi.enabled" -> "true") {

Review Comment:
   ```suggestion
       withSQLConf(SQLConf.ANSI_ENABLED.key -> "true") {
   ```
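   The point of referencing `SQLConf.ANSI_ENABLED.key` rather than the raw string is that a named constant is checked by the compiler and survives renames. A minimal plain-Scala sketch of the pattern (the `DemoConf` object and key name here are illustrative, not Spark's actual `SQLConf`):
   ```scala
   // Illustrative stand-in for a config registry such as Spark's SQLConf:
   // callers reference the key through a constant, never a string literal.
   object DemoConf {
     val AnsiEnabledKey: String = "spark.sql.ansi.enabled"
   }

   // A typo in `AnsiEnabledKey` fails compilation; a typo in a raw
   // "spark.sql.ansi.enabled" string would only fail at runtime, if ever.
   val conf: Map[String, String] = Map(DemoConf.AnsiEnabledKey -> "true")
   ```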



##########
sql/core/src/test/scala/org/apache/spark/sql/errors/QueryExecutionErrorsSuite.scala:
##########
@@ -625,6 +625,21 @@ class QueryExecutionErrorsSuite
     }
   }
 
+  test("BINARY_ARITHMETIC_CAUSE_OVERFLOW: byte plus byte result overflow") {
+    withSQLConf("spark.sql.ansi.enabled" -> "true") {
+      checkError(
+        exception = intercept[SparkArithmeticException] {
+          sql(s"select CAST('127' AS TINYINT) + CAST('5' AS TINYINT)").collect()

Review Comment:
   You can write a TINYINT literal directly instead of casting:
   ```suggestion
             sql(s"select 127Y + 5Y").collect()
   ```
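   For background on why `127Y + 5Y` overflows: TINYINT corresponds to the JVM `Byte`, whose range is -128 to 127. A plain-Scala sketch (no Spark needed) of the wraparound that ANSI mode turns into an error:
   ```scala
   // Byte (TINYINT) holds -128..127. Plain JVM arithmetic widens to Int,
   // so narrowing the sum back to Byte silently wraps around; under
   // spark.sql.ansi.enabled=true Spark raises an overflow error instead.
   val a: Byte = 127
   val b: Byte = 5
   val wrapped: Byte = (a + b).toByte // 132 does not fit in a Byte; wraps to -124
   ```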



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
For additional commands, e-mail: reviews-help@spark.apache.org