Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/02/10 04:06:34 UTC

[GitHub] [spark] amaliujia opened a new pull request #35472: [SPARK-38170] Fix //sql/hive-thriftserver:org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInHttpSuite-hive-2.3__hadoop-2.7 in ANSI

amaliujia opened a new pull request #35472:
URL: https://github.com/apache/spark/pull/35472


   
   
   ### What changes were proposed in this pull request?
   
   Fix ThriftServerWithSparkContextInHttpSuite when ANSI mode is on.
   
   ### Why are the changes needed?
   To help enable SQL `ANSI` mode by default.
   
   ### Does this PR introduce _any_ user-facing change?
   No. This is a test-only change.
   
   ### How was this patch tested?
   UT


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] amaliujia edited a comment on pull request #35472: [SPARK-38170][TESTS][SQL] Make ThriftServerWithSparkContextInHttpSuite pass with ANSI enabled

Posted by GitBox <gi...@apache.org>.
amaliujia edited a comment on pull request #35472:
URL: https://github.com/apache/spark/pull/35472#issuecomment-1035920384


   > @amaliujia I just tried running the failed test. I think it would be better to change the query instead, e.g
   > 
   > ```
   > select abs(date'2021-01-1')
   > ```
   > 
   > This query throws the same exception with/without ANSI mode on.
   
   I see. Interesting. So the goal is to avoid issues introduced by enabling ANSI mode in such non-core tests (thus changing the query is OK). If so, the change will be easier to make.
   
   
   > Also, it would be better to combine the fix of one or two more test suites together in one PR.
   
   If we can change the queries in Thrift server related tests, then combining multiple fixes into one PR wouldn't be too bad.
   




[GitHub] [spark] amaliujia commented on pull request #35472: [SPARK-38170][TESTS][SQL] Make ThriftServerWithSparkContextInHttpSuite pass with ANSI enabled

Posted by GitBox <gi...@apache.org>.
amaliujia commented on pull request #35472:
URL: https://github.com/apache/spark/pull/35472#issuecomment-1035863555


   > > @amaliujia We will have a new job with ANSI mode on. So, I think we need to read the conf value of ANSI mode and run different checks, instead of running both scenarios in one test case.
   > 
   > @gengliangwang I think a unit test should test both ANSI on and ANSI off behaviors regardless of whether we have a new CI job or not, unless the unit test is testing against a particular ANSI behavior.
   
   @gengliangwang please let me know your opinion. I can make the appropriate change if we reach a consensus here.




[GitHub] [spark] amaliujia commented on pull request #35472: [SPARK-38170] Fix //sql/hive-thriftserver:org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInHttpSuite-hive-2.3__hadoop-2.7 in ANSI

Posted by GitBox <gi...@apache.org>.
amaliujia commented on pull request #35472:
URL: https://github.com/apache/spark/pull/35472#issuecomment-1035234857


   @gengliangwang 
   
   I updated the test case to address your comments.




[GitHub] [spark] HyukjinKwon commented on a change in pull request #35472: [SPARK-38170][TESTS][SQL] Make ThriftServerWithSparkContextInHttpSuite pass with ANSI enabled

Posted by GitBox <gi...@apache.org>.
HyukjinKwon commented on a change in pull request #35472:
URL: https://github.com/apache/spark/pull/35472#discussion_r804307464



##########
File path: sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/ThriftServerWithSparkContextSuite.scala
##########
@@ -57,34 +57,50 @@ trait ThriftServerWithSparkContextSuite extends SharedThriftServer {
 
   test("Full stack traces as error message for jdbc or thrift client") {
     val sql = "select date_sub(date'2011-11-11', '1.2')"
+    val ansi: Boolean = SQLConf.get.getConf(SQLConf.ANSI_ENABLED)
+
     withCLIServiceClient() { client =>
       val sessionHandle = client.openSession(user, "")
 
       val confOverlay = new java.util.HashMap[java.lang.String, java.lang.String]
+      val exec: String => OperationHandle = client.executeStatement(sessionHandle, _, confOverlay)

Review comment:
       Is this change related? It seems unnecessary to me.






[GitHub] [spark] amaliujia commented on pull request #35472: [SPARK-38170] Fix //sql/hive-thriftserver:org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInHttpSuite-hive-2.3__hadoop-2.7 in ANSI

Posted by GitBox <gi...@apache.org>.
amaliujia commented on pull request #35472:
URL: https://github.com/apache/spark/pull/35472#issuecomment-1035105961


   @gengliangwang is there an example of how to read the conf value? Or should I wait until the new job starts and then continue fixing this test?




[GitHub] [spark] HyukjinKwon commented on a change in pull request #35472: [SPARK-38170][TESTS][SQL] Make ThriftServerWithSparkContextInHttpSuite pass with ANSI enabled

Posted by GitBox <gi...@apache.org>.
HyukjinKwon commented on a change in pull request #35472:
URL: https://github.com/apache/spark/pull/35472#discussion_r804355364



##########
File path: sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/ThriftServerWithSparkContextSuite.scala
##########
@@ -57,34 +57,50 @@ trait ThriftServerWithSparkContextSuite extends SharedThriftServer {
 
   test("Full stack traces as error message for jdbc or thrift client") {
     val sql = "select date_sub(date'2011-11-11', '1.2')"
+    val ansi: Boolean = SQLConf.get.getConf(SQLConf.ANSI_ENABLED)
+
     withCLIServiceClient() { client =>
       val sessionHandle = client.openSession(user, "")
 
       val confOverlay = new java.util.HashMap[java.lang.String, java.lang.String]
+      val exec: String => OperationHandle = client.executeStatement(sessionHandle, _, confOverlay)

Review comment:
       Ah, okay. It was to deduplicate. I think it's fine not to do this since it's a one-liner, and making it a partial function is less readable IMO. But I am fine either way.
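
       For reference, the two styles side by side (a sketch using the names from the diff above, not new code in the PR):

       ```
       // Bind sessionHandle and confOverlay once, so each statement is just exec(...).
       val exec: String => OperationHandle =
         client.executeStatement(sessionHandle, _, confOverlay)
       exec(sql)

       // Versus spelling out the call at each site, as the test did before:
       client.executeStatement(sessionHandle, sql, confOverlay)
       ```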






[GitHub] [spark] amaliujia edited a comment on pull request #35472: [SPARK-38170] Fix //sql/hive-thriftserver:org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInHttpSuite-hive-2.3__hadoop-2.7 in ANSI

Posted by GitBox <gi...@apache.org>.
amaliujia edited a comment on pull request #35472:
URL: https://github.com/apache/spark/pull/35472#issuecomment-1034471988


   R: @gengliangwang @allisonwang-db 




[GitHub] [spark] gengliangwang commented on pull request #35472: [SPARK-38170][TESTS][SQL] Make ThriftServerWithSparkContextInHttpSuite pass with ANSI enabled

Posted by GitBox <gi...@apache.org>.
gengliangwang commented on pull request #35472:
URL: https://github.com/apache/spark/pull/35472#issuecomment-1063832762








[GitHub] [spark] amaliujia commented on a change in pull request #35472: [SPARK-38170] Fix //sql/hive-thriftserver:org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInHttpSuite-hive-2.3__hadoop-2.7 in ANSI

Posted by GitBox <gi...@apache.org>.
amaliujia commented on a change in pull request #35472:
URL: https://github.com/apache/spark/pull/35472#discussion_r803985761



##########
File path: sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/ThriftServerWithSparkContextSuite.scala
##########
@@ -57,34 +57,50 @@ trait ThriftServerWithSparkContextSuite extends SharedThriftServer {
 
   test("Full stack traces as error message for jdbc or thrift client") {
     val sql = "select date_sub(date'2011-11-11', '1.2')"
+    val ansi: Boolean = SQLConf.get.getConf(SQLConf.ANSI_ENABLED)
+
     withCLIServiceClient() { client =>
       val sessionHandle = client.openSession(user, "")
 
       val confOverlay = new java.util.HashMap[java.lang.String, java.lang.String]
+      val exec: String => OperationHandle = client.executeStatement(sessionHandle, _, confOverlay)
+
       val e = intercept[HiveSQLException] {
-        client.executeStatement(
-          sessionHandle,
-          sql,
-          confOverlay)
+        exec(sql)
       }
 
-      assert(e.getMessage
-        .contains("The second argument of 'date_sub' function needs to be an integer."))
-      assert(!e.getMessage.contains("" +
-        "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
-      assert(e.getSQLState == "22023")
+      if (ansi) {
+        assert(!e.getMessage
+          .contains("The second argument of 'date_sub' function needs to be an integer."))
+        assert(e.getMessage.contains(
+          "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
+        assert(e.getSQLState == "42000")
+      } else {
+        assert(e.getMessage
+          .contains("The second argument of 'date_sub' function needs to be an integer."))
+        assert(!e.getMessage.contains(
+          "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
+        assert(e.getSQLState == "22023")
+      }
     }
 
     withJdbcStatement { statement =>
       val e = intercept[SQLException] {
         statement.executeQuery(sql)
       }
-      assert(e.getMessage
-        .contains("The second argument of 'date_sub' function needs to be an integer."))
-      assert(e.getMessage.contains("[SECOND_FUNCTION_ARGUMENT_NOT_INTEGER]"))
-      assert(e.getMessage.contains("" +
-        "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
-      assert(e.getSQLState == "22023")
+
+      if (ansi) {
+        assert(e.getMessage.contains(
+          "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
+        assert(e.getSQLState == "42000")
+      } else {
+        assert(e.getMessage
+          .contains("The second argument of 'date_sub' function needs to be an integer."))
+        assert(e.getMessage.contains("[SECOND_FUNCTION_ARGUMENT_NOT_INTEGER]"))

Review comment:
       This piece is actually the non-ANSI path, i.e. existing code. Do you think we should improve the existing code here?






[GitHub] [spark] gengliangwang commented on pull request #35472: [SPARK-38170] Fix //sql/hive-thriftserver:org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInHttpSuite-hive-2.3__hadoop-2.7 in ANSI

Posted by GitBox <gi...@apache.org>.
gengliangwang commented on pull request #35472:
URL: https://github.com/apache/spark/pull/35472#issuecomment-1034542588


   @amaliujia We will have a new job with ANSI mode on. So, I think we need to read the conf value of ANSI mode and run different checks, instead of running both scenarios in one test case.




[GitHub] [spark] amaliujia commented on pull request #35472: [SPARK-38170][TESTS][SQL] Make ThriftServerWithSparkContextInHttpSuite pass with ANSI enabled

Posted by GitBox <gi...@apache.org>.
amaliujia commented on pull request #35472:
URL: https://github.com/apache/spark/pull/35472#issuecomment-1035920384


   > @amaliujia I just tried running the failed test. I think it would be better to change the query instead, e.g
   > 
   > ```
   > select abs(date'2021-01-1')
   > ```
   > 
   > This query throws the same exception with/without ANSI mode on.
   
   I see. Interesting. So the goal is to avoid issues introduced by enabling ANSI mode in such non-core tests (thus changing the query is OK). If so, the change will be easier to make.
   
   
   > Also, it would be better to combine the fix of one or two more test suites together in one PR.
   
   If we can change the queries in Thrift server related tests, then combining multiple fixes into one PR wouldn't be too bad.
   




[GitHub] [spark] gengliangwang commented on pull request #35472: [SPARK-38170] Fix //sql/hive-thriftserver:org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInHttpSuite-hive-2.3__hadoop-2.7 in ANSI

Posted by GitBox <gi...@apache.org>.
gengliangwang commented on pull request #35472:
URL: https://github.com/apache/spark/pull/35472#issuecomment-1035130262


   BTW, how many test suites failed under hive-thriftserver? 




[GitHub] [spark] AmplabJenkins commented on pull request #35472: [SPARK-38170][TESTS][SQL] Make ThriftServerWithSparkContextInHttpSuite pass with ANSI enabled

Posted by GitBox <gi...@apache.org>.
AmplabJenkins commented on pull request #35472:
URL: https://github.com/apache/spark/pull/35472#issuecomment-1036534510


   Can one of the admins verify this patch?




[GitHub] [spark] amaliujia commented on pull request #35472: [SPARK-38170] Fix //sql/hive-thriftserver:org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInHttpSuite-hive-2.3__hadoop-2.7 in ANSI

Posted by GitBox <gi...@apache.org>.
amaliujia commented on pull request #35472:
URL: https://github.com/apache/spark/pull/35472#issuecomment-1034471988


   R: @gengliangwang 




[GitHub] [spark] gengliangwang commented on pull request #35472: [SPARK-38170] Fix //sql/hive-thriftserver:org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInHttpSuite-hive-2.3__hadoop-2.7 in ANSI

Posted by GitBox <gi...@apache.org>.
gengliangwang commented on pull request #35472:
URL: https://github.com/apache/spark/pull/35472#issuecomment-1035128435


   @amaliujia `SQLConf.get.ansiEnabled`




[GitHub] [spark] amaliujia commented on a change in pull request #35472: [SPARK-38170][TESTS][SQL] Make ThriftServerWithSparkContextInHttpSuite pass with ANSI enabled

Posted by GitBox <gi...@apache.org>.
amaliujia commented on a change in pull request #35472:
URL: https://github.com/apache/spark/pull/35472#discussion_r804350291



##########
File path: sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/ThriftServerWithSparkContextSuite.scala
##########
@@ -57,34 +57,50 @@ trait ThriftServerWithSparkContextSuite extends SharedThriftServer {
 
   test("Full stack traces as error message for jdbc or thrift client") {
     val sql = "select date_sub(date'2011-11-11', '1.2')"
+    val ansi: Boolean = SQLConf.get.getConf(SQLConf.ANSI_ENABLED)
+
     withCLIServiceClient() { client =>
       val sessionHandle = client.openSession(user, "")
 
       val confOverlay = new java.util.HashMap[java.lang.String, java.lang.String]
+      val exec: String => OperationHandle = client.executeStatement(sessionHandle, _, confOverlay)
+
       val e = intercept[HiveSQLException] {
-        client.executeStatement(
-          sessionHandle,
-          sql,
-          confOverlay)
+        exec(sql)
       }
 
-      assert(e.getMessage
-        .contains("The second argument of 'date_sub' function needs to be an integer."))
-      assert(!e.getMessage.contains("" +
-        "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
-      assert(e.getSQLState == "22023")
+      if (ansi) {
+        assert(!e.getMessage

Review comment:
       @allisonwang-db this PR just fixes the test case. If we have concerns about the error, we can file a JIRA to investigate further rather than coupling it with this task. What do you think?






[GitHub] [spark] amaliujia commented on a change in pull request #35472: [SPARK-38170] Fix //sql/hive-thriftserver:org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInHttpSuite-hive-2.3__hadoop-2.7 in ANSI

Posted by GitBox <gi...@apache.org>.
amaliujia commented on a change in pull request #35472:
URL: https://github.com/apache/spark/pull/35472#discussion_r804186174



##########
File path: sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/ThriftServerWithSparkContextSuite.scala
##########
@@ -57,34 +57,50 @@ trait ThriftServerWithSparkContextSuite extends SharedThriftServer {
 
   test("Full stack traces as error message for jdbc or thrift client") {
     val sql = "select date_sub(date'2011-11-11', '1.2')"
+    val ansi: Boolean = SQLConf.get.getConf(SQLConf.ANSI_ENABLED)
+
     withCLIServiceClient() { client =>
       val sessionHandle = client.openSession(user, "")
 
       val confOverlay = new java.util.HashMap[java.lang.String, java.lang.String]
+      val exec: String => OperationHandle = client.executeStatement(sessionHandle, _, confOverlay)
+
       val e = intercept[HiveSQLException] {
-        client.executeStatement(
-          sessionHandle,
-          sql,
-          confOverlay)
+        exec(sql)
       }
 
-      assert(e.getMessage
-        .contains("The second argument of 'date_sub' function needs to be an integer."))
-      assert(!e.getMessage.contains("" +
-        "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
-      assert(e.getSQLState == "22023")
+      if (ansi) {
+        assert(!e.getMessage

Review comment:
       I took a closer look and saw that in ANSI mode, the second argument of `date_sub` is wrapped in an `ansi_cast`. Does this cause the error message change?






[GitHub] [spark] gengliangwang commented on a change in pull request #35472: [SPARK-38170] Fix //sql/hive-thriftserver:org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInHttpSuite-hive-2.3__hadoop-2.7 in ANSI

Posted by GitBox <gi...@apache.org>.
gengliangwang commented on a change in pull request #35472:
URL: https://github.com/apache/spark/pull/35472#discussion_r803336147



##########
File path: sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/ThriftServerWithSparkContextSuite.scala
##########
@@ -61,22 +61,32 @@ trait ThriftServerWithSparkContextSuite extends SharedThriftServer {
       val sessionHandle = client.openSession(user, "")
 
       val confOverlay = new java.util.HashMap[java.lang.String, java.lang.String]
+      val exec: String => OperationHandle = client.executeStatement(sessionHandle, _, confOverlay)
+
       val e = intercept[HiveSQLException] {
-        client.executeStatement(
-          sessionHandle,
-          sql,
-          confOverlay)
+        exec(s"set ${SQLConf.ANSI_ENABLED.key}=false")
+        exec(sql)
       }
 
       assert(e.getMessage
         .contains("The second argument of 'date_sub' function needs to be an integer."))
       assert(!e.getMessage.contains("" +
         "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
       assert(e.getSQLState == "22023")
+
+      val e1 = intercept[HiveSQLException] {
+        exec(s"set ${SQLConf.ANSI_ENABLED.key}=true")
+        exec(sql)
+      }
+
+      assert(e.getMessage

Review comment:
       Shall we check e1 instead?
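
       That is, the assertions below should target `e1` from the ANSI-enabled run, roughly like this (a sketch; the expected ANSI message and SQLSTATE are taken from the later revision of this PR elsewhere in this thread):

       ```
       val e1 = intercept[HiveSQLException] {
         exec(s"set ${SQLConf.ANSI_ENABLED.key}=true")
         exec(sql)
       }
       // Check the exception from the ANSI-enabled run, not the earlier `e`.
       assert(e1.getMessage.contains(
         "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
       assert(e1.getSQLState == "42000")
       ```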






[GitHub] [spark] amaliujia commented on pull request #35472: [SPARK-38170] Fix //sql/hive-thriftserver:org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInHttpSuite-hive-2.3__hadoop-2.7 in ANSI

Posted by GitBox <gi...@apache.org>.
amaliujia commented on pull request #35472:
URL: https://github.com/apache/spark/pull/35472#issuecomment-1035134616


   @gengliangwang around 6; 2 of them are being fixed.




[GitHub] [spark] amaliujia commented on a change in pull request #35472: [SPARK-38170][TESTS][SQL] Make ThriftServerWithSparkContextInHttpSuite pass with ANSI enabled

Posted by GitBox <gi...@apache.org>.
amaliujia commented on a change in pull request #35472:
URL: https://github.com/apache/spark/pull/35472#discussion_r804349859



##########
File path: sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/ThriftServerWithSparkContextSuite.scala
##########
@@ -57,34 +57,50 @@ trait ThriftServerWithSparkContextSuite extends SharedThriftServer {
 
   test("Full stack traces as error message for jdbc or thrift client") {
     val sql = "select date_sub(date'2011-11-11', '1.2')"
+    val ansi: Boolean = SQLConf.get.getConf(SQLConf.ANSI_ENABLED)
+
     withCLIServiceClient() { client =>
       val sessionHandle = client.openSession(user, "")
 
       val confOverlay = new java.util.HashMap[java.lang.String, java.lang.String]
+      val exec: String => OperationHandle = client.executeStatement(sessionHandle, _, confOverlay)
+
       val e = intercept[HiveSQLException] {
-        client.executeStatement(
-          sessionHandle,
-          sql,
-          confOverlay)
+        exec(sql)
       }
 
-      assert(e.getMessage
-        .contains("The second argument of 'date_sub' function needs to be an integer."))
-      assert(!e.getMessage.contains("" +
-        "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
-      assert(e.getSQLState == "22023")
+      if (ansi) {
+        assert(!e.getMessage
+          .contains("The second argument of 'date_sub' function needs to be an integer."))
+        assert(e.getMessage.contains(
+          "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
+        assert(e.getSQLState == "42000")
+      } else {
+        assert(e.getMessage
+          .contains("The second argument of 'date_sub' function needs to be an integer."))
+        assert(!e.getMessage.contains(
+          "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
+        assert(e.getSQLState == "22023")
+      }
     }
 
     withJdbcStatement { statement =>
       val e = intercept[SQLException] {
         statement.executeQuery(sql)
       }
-      assert(e.getMessage
-        .contains("The second argument of 'date_sub' function needs to be an integer."))
-      assert(e.getMessage.contains("[SECOND_FUNCTION_ARGUMENT_NOT_INTEGER]"))
-      assert(e.getMessage.contains("" +
-        "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
-      assert(e.getSQLState == "22023")
+
+      if (ansi) {
+        assert(e.getMessage.contains(
+          "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
+        assert(e.getSQLState == "42000")
+      } else {
+        assert(e.getMessage
+          .contains("The second argument of 'date_sub' function needs to be an integer."))
+        assert(e.getMessage.contains("[SECOND_FUNCTION_ARGUMENT_NOT_INTEGER]"))

Review comment:
       I think it's because `e` here is a `java.sql.SQLException`, which does not have `getErrorClass`. Maybe that's why this code was written the way we see it now.






[GitHub] [spark] amaliujia closed pull request #35472: [SPARK-38170][TESTS][SQL] Make ThriftServerWithSparkContextInHttpSuite pass with ANSI enabled

Posted by GitBox <gi...@apache.org>.
amaliujia closed pull request #35472:
URL: https://github.com/apache/spark/pull/35472


   




[GitHub] [spark] HyukjinKwon commented on a change in pull request #35472: [SPARK-38170] Fix //sql/hive-thriftserver:org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInHttpSuite-hive-2.3__hadoop-2.7 in ANSI

Posted by GitBox <gi...@apache.org>.
HyukjinKwon commented on a change in pull request #35472:
URL: https://github.com/apache/spark/pull/35472#discussion_r804306013



##########
File path: sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/ThriftServerWithSparkContextSuite.scala
##########
@@ -57,34 +57,50 @@ trait ThriftServerWithSparkContextSuite extends SharedThriftServer {
 
   test("Full stack traces as error message for jdbc or thrift client") {
     val sql = "select date_sub(date'2011-11-11', '1.2')"
+    val ansi: Boolean = SQLConf.get.getConf(SQLConf.ANSI_ENABLED)
+
     withCLIServiceClient() { client =>
       val sessionHandle = client.openSession(user, "")
 
       val confOverlay = new java.util.HashMap[java.lang.String, java.lang.String]
+      val exec: String => OperationHandle = client.executeStatement(sessionHandle, _, confOverlay)
+
       val e = intercept[HiveSQLException] {
-        client.executeStatement(
-          sessionHandle,
-          sql,
-          confOverlay)
+        exec(sql)
       }
 
-      assert(e.getMessage
-        .contains("The second argument of 'date_sub' function needs to be an integer."))
-      assert(!e.getMessage.contains("" +
-        "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
-      assert(e.getSQLState == "22023")
+      if (ansi) {
+        assert(!e.getMessage
+          .contains("The second argument of 'date_sub' function needs to be an integer."))
+        assert(e.getMessage.contains(
+          "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
+        assert(e.getSQLState == "42000")
+      } else {
+        assert(e.getMessage
+          .contains("The second argument of 'date_sub' function needs to be an integer."))
+        assert(!e.getMessage.contains(
+          "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
+        assert(e.getSQLState == "22023")
+      }
     }
 
     withJdbcStatement { statement =>
       val e = intercept[SQLException] {
         statement.executeQuery(sql)
       }
-      assert(e.getMessage
-        .contains("The second argument of 'date_sub' function needs to be an integer."))
-      assert(e.getMessage.contains("[SECOND_FUNCTION_ARGUMENT_NOT_INTEGER]"))
-      assert(e.getMessage.contains("" +
-        "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
-      assert(e.getSQLState == "22023")
+
+      if (ansi) {
+        assert(e.getMessage.contains(
+          "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
+        assert(e.getSQLState == "42000")
+      } else {
+        assert(e.getMessage
+          .contains("The second argument of 'date_sub' function needs to be an integer."))
+        assert(e.getMessage.contains("[SECOND_FUNCTION_ARGUMENT_NOT_INTEGER]"))

Review comment:
       Yeah, I think we can fix it together while we're here.






[GitHub] [spark] allisonwang-db commented on a change in pull request #35472: [SPARK-38170] Fix //sql/hive-thriftserver:org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInHttpSuite-hive-2.3__hadoop-2.7 in ANSI

Posted by GitBox <gi...@apache.org>.
allisonwang-db commented on a change in pull request #35472:
URL: https://github.com/apache/spark/pull/35472#discussion_r803969092



##########
File path: sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/ThriftServerWithSparkContextSuite.scala
##########
@@ -57,34 +57,50 @@ trait ThriftServerWithSparkContextSuite extends SharedThriftServer {
 
   test("Full stack traces as error message for jdbc or thrift client") {
     val sql = "select date_sub(date'2011-11-11', '1.2')"
+    val ansi: Boolean = SQLConf.get.getConf(SQLConf.ANSI_ENABLED)
+
     withCLIServiceClient() { client =>
       val sessionHandle = client.openSession(user, "")
 
       val confOverlay = new java.util.HashMap[java.lang.String, java.lang.String]
+      val exec: String => OperationHandle = client.executeStatement(sessionHandle, _, confOverlay)
+
       val e = intercept[HiveSQLException] {
-        client.executeStatement(
-          sessionHandle,
-          sql,
-          confOverlay)
+        exec(sql)
       }
 
-      assert(e.getMessage
-        .contains("The second argument of 'date_sub' function needs to be an integer."))
-      assert(!e.getMessage.contains("" +
-        "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
-      assert(e.getSQLState == "22023")
+      if (ansi) {
+        assert(!e.getMessage

Review comment:
       Is the change in the error message when ANSI is enabled intended?

##########
File path: sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/ThriftServerWithSparkContextSuite.scala
##########
@@ -57,34 +57,50 @@ trait ThriftServerWithSparkContextSuite extends SharedThriftServer {
 
   test("Full stack traces as error message for jdbc or thrift client") {
     val sql = "select date_sub(date'2011-11-11', '1.2')"
+    val ansi: Boolean = SQLConf.get.getConf(SQLConf.ANSI_ENABLED)
+
     withCLIServiceClient() { client =>
       val sessionHandle = client.openSession(user, "")
 
       val confOverlay = new java.util.HashMap[java.lang.String, java.lang.String]
+      val exec: String => OperationHandle = client.executeStatement(sessionHandle, _, confOverlay)
+
       val e = intercept[HiveSQLException] {
-        client.executeStatement(
-          sessionHandle,
-          sql,
-          confOverlay)
+        exec(sql)
       }
 
-      assert(e.getMessage
-        .contains("The second argument of 'date_sub' function needs to be an integer."))
-      assert(!e.getMessage.contains("" +
-        "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
-      assert(e.getSQLState == "22023")
+      if (ansi) {
+        assert(!e.getMessage
+          .contains("The second argument of 'date_sub' function needs to be an integer."))
+        assert(e.getMessage.contains(
+          "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
+        assert(e.getSQLState == "42000")
+      } else {
+        assert(e.getMessage
+          .contains("The second argument of 'date_sub' function needs to be an integer."))
+        assert(!e.getMessage.contains(
+          "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
+        assert(e.getSQLState == "22023")
+      }
     }
 
     withJdbcStatement { statement =>
       val e = intercept[SQLException] {
         statement.executeQuery(sql)
       }
-      assert(e.getMessage
-        .contains("The second argument of 'date_sub' function needs to be an integer."))
-      assert(e.getMessage.contains("[SECOND_FUNCTION_ARGUMENT_NOT_INTEGER]"))
-      assert(e.getMessage.contains("" +
-        "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
-      assert(e.getSQLState == "22023")
+
+      if (ansi) {
+        assert(e.getMessage.contains(
+          "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
+        assert(e.getSQLState == "42000")
+      } else {
+        assert(e.getMessage
+          .contains("The second argument of 'date_sub' function needs to be an integer."))
+        assert(e.getMessage.contains("[SECOND_FUNCTION_ARGUMENT_NOT_INTEGER]"))

Review comment:
       Maybe `e.getErrorClass`?
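
       A sketch of that suggestion, assuming the client-side exception exposed Spark's error-class API (as noted elsewhere in this thread, `e` is a plain `java.sql.SQLException` here, so the message-based check is what currently works):

       ```
       import org.apache.spark.SparkThrowable

       e match {
         case t: SparkThrowable =>
           // Only possible if the exception actually implements SparkThrowable.
           assert(t.getErrorClass == "SECOND_FUNCTION_ARGUMENT_NOT_INTEGER")
         case _ =>
           // Fallback: match the error-class tag in the rendered message, as the test does today.
           assert(e.getMessage.contains("[SECOND_FUNCTION_ARGUMENT_NOT_INTEGER]"))
       }
       ```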






[GitHub] [spark] gengliangwang commented on pull request #35472: [SPARK-38170][TESTS][SQL] Make ThriftServerWithSparkContextInHttpSuite pass with ANSI enabled

Posted by GitBox <gi...@apache.org>.
gengliangwang commented on pull request #35472:
URL: https://github.com/apache/spark/pull/35472#issuecomment-1035881146


   @amaliujia I just tried running the failed test.
   I think it would be better to change the query instead, e.g
   ```
   select abs(date'2021-01-1')
   ```
   This query throws the same exception with/without ANSI mode on.
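
   A sketch of that direction, keeping the existing test structure and swapping in a query whose failure does not depend on the ANSI flag (the concrete expected message/SQLSTATE would have to come from the exception this query actually raises):

   ```
   val sql = "select abs(date'2021-01-1')"

   val e = intercept[HiveSQLException] {
     client.executeStatement(sessionHandle, sql, confOverlay)
   }
   // Placeholder check only: fill in the real message/SQLSTATE observed for this query.
   assert(e.getMessage.contains("abs"))
   ```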




[GitHub] [spark] gengliangwang commented on pull request #35472: [SPARK-38170][TESTS][SQL] Make ThriftServerWithSparkContextInHttpSuite pass with ANSI enabled

Posted by GitBox <gi...@apache.org>.
gengliangwang commented on pull request #35472:
URL: https://github.com/apache/spark/pull/35472#issuecomment-1035881500


   Also, it would be better to combine the fix of one or two more test suites together in one PR.




[GitHub] [spark] gengliangwang commented on pull request #35472: [SPARK-38170][TESTS][SQL] Make ThriftServerWithSparkContextInHttpSuite pass with ANSI enabled

Posted by GitBox <gi...@apache.org>.
gengliangwang commented on pull request #35472:
URL: https://github.com/apache/spark/pull/35472#issuecomment-1035864329


   >> @gengliangwang I think a unit test should test both ANSI on and ANSI off behaviors regardless of whether we have a new CI job or not, unless the unit test is testing against a particular ANSI behavior.
   
   > @gengliangwang please let me know your opinion. I can make the appropriate change if we reach a consensus here.
   
   I see. Either way should be fine then.
   
   




[GitHub] [spark] amaliujia commented on a change in pull request #35472: [SPARK-38170] Fix //sql/hive-thriftserver:org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInHttpSuite-hive-2.3__hadoop-2.7 in ANSI

Posted by GitBox <gi...@apache.org>.
amaliujia commented on a change in pull request #35472:
URL: https://github.com/apache/spark/pull/35472#discussion_r803985416



##########
File path: sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/ThriftServerWithSparkContextSuite.scala
##########
@@ -57,34 +57,50 @@ trait ThriftServerWithSparkContextSuite extends SharedThriftServer {
 
   test("Full stack traces as error message for jdbc or thrift client") {
     val sql = "select date_sub(date'2011-11-11', '1.2')"
+    val ansi: Boolean = SQLConf.get.getConf(SQLConf.ANSI_ENABLED)
+
     withCLIServiceClient() { client =>
       val sessionHandle = client.openSession(user, "")
 
       val confOverlay = new java.util.HashMap[java.lang.String, java.lang.String]
+      val exec: String => OperationHandle = client.executeStatement(sessionHandle, _, confOverlay)
+
       val e = intercept[HiveSQLException] {
-        client.executeStatement(
-          sessionHandle,
-          sql,
-          confOverlay)
+        exec(sql)
       }
 
-      assert(e.getMessage
-        .contains("The second argument of 'date_sub' function needs to be an integer."))
-      assert(!e.getMessage.contains("" +
-        "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
-      assert(e.getSQLState == "22023")
+      if (ansi) {
+        assert(!e.getMessage

Review comment:
       Yes. ANSI changes the error message.






[GitHub] [spark] allisonwang-db commented on a change in pull request #35472: [SPARK-38170] Fix //sql/hive-thriftserver:org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInHttpSuite-hive-2.3__hadoop-2.7 in ANSI

Posted by GitBox <gi...@apache.org>.
allisonwang-db commented on a change in pull request #35472:
URL: https://github.com/apache/spark/pull/35472#discussion_r803999593



##########
File path: sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/ThriftServerWithSparkContextSuite.scala
##########
@@ -57,34 +57,50 @@ trait ThriftServerWithSparkContextSuite extends SharedThriftServer {
 
   test("Full stack traces as error message for jdbc or thrift client") {
     val sql = "select date_sub(date'2011-11-11', '1.2')"
+    val ansi: Boolean = SQLConf.get.getConf(SQLConf.ANSI_ENABLED)
+
     withCLIServiceClient() { client =>
       val sessionHandle = client.openSession(user, "")
 
       val confOverlay = new java.util.HashMap[java.lang.String, java.lang.String]
+      val exec: String => OperationHandle = client.executeStatement(sessionHandle, _, confOverlay)
+
       val e = intercept[HiveSQLException] {
-        client.executeStatement(
-          sessionHandle,
-          sql,
-          confOverlay)
+        exec(sql)
       }
 
-      assert(e.getMessage
-        .contains("The second argument of 'date_sub' function needs to be an integer."))
-      assert(!e.getMessage.contains("" +
-        "java.lang.NumberFormatException: invalid input syntax for type numeric: 1.2"))
-      assert(e.getSQLState == "22023")
+      if (ansi) {
+        assert(!e.getMessage

Review comment:
       I am curious why it got changed. This is not related to this PR, but I think the original error message is better: "The second argument of 'date_sub' function needs to be an integer."






[GitHub] [spark] allisonwang-db commented on pull request #35472: [SPARK-38170] Fix //sql/hive-thriftserver:org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInHttpSuite-hive-2.3__hadoop-2.7 in ANSI

Posted by GitBox <gi...@apache.org>.
allisonwang-db commented on pull request #35472:
URL: https://github.com/apache/spark/pull/35472#issuecomment-1035316678


   > @amaliujia We will have a new job with ANSI mode on. So, I think we need to read the conf value of ANSI mode and run different checks, instead of running both scenarios in one test case.
   
   @gengliangwang I think a unit test should test both ANSI on and ANSI off behaviors regardless of whether we have a new CI job or not, unless the unit test is testing against a particular ANSI behavior.
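
   For illustration, a sketch of the "both modes in one test" shape, reusing the SET-based toggling from an earlier revision of this PR and the expected SQLSTATEs from the diffs in this thread:

   ```
   Seq(false, true).foreach { ansi =>
     val e = intercept[HiveSQLException] {
       exec(s"set ${SQLConf.ANSI_ENABLED.key}=$ansi")
       exec(sql)
     }
     if (ansi) {
       assert(e.getSQLState == "42000")  // ANSI: NumberFormatException path
     } else {
       assert(e.getSQLState == "22023")  // non-ANSI: date_sub argument error
     }
   }
   ```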

