Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/11/03 04:29:49 UTC

[GitHub] [spark] itholic commented on a diff in pull request #38447: [SPARK-40973][SQL] Rename `_LEGACY_ERROR_TEMP_0055` to `UNCLOSED_BRACKETED_COMMENT`

itholic commented on code in PR #38447:
URL: https://github.com/apache/spark/pull/38447#discussion_r1012485332


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryParsingErrors.scala:
##########
@@ -608,8 +608,12 @@ private[sql] object QueryParsingErrors extends QueryErrorsBase {
   }
 
   def unclosedBracketedCommentError(command: String, position: Origin): Throwable = {
-    new ParseException(Some(command), "Unclosed bracketed comment", position, position,
-      Some("_LEGACY_ERROR_TEMP_0055"))
+    new ParseException(
+      Some(command),
+      "Found an unclosed bracketed comment. Please, append */ at the end of the comment.",

Review Comment:
   I think that if we don't specify the message here, `null` is returned, so `CliSuite.SPARK-37555: spark-sql should pass last unclosed comment to backend` fails as below:
   
   ```
   CliSuite.SPARK-37555: spark-sql should pass last unclosed comment to backend
   org.scalatest.exceptions.TestFailedException: 
   =======================
   CliSuite failure output
   =======================
   Spark SQL CLI command line: ../../bin/spark-sql --master local --driver-java-options -Dderby.system.durability=test --conf spark.ui.enabled=false --hiveconf javax.jdo.option.ConnectionURL=jdbc:derby:;databaseName=/home/runner/work/spark/spark/target/tmp/spark-346cf558-8a95-43f1-b893-81f6a5571f3f;create=true --hiveconf hive.exec.scratchdir=/home/runner/work/spark/spark/target/tmp/spark-f108bccb-3f71-4854-aa70-c18ea721617c --hiveconf conf1=conftest --hiveconf conf2=1 --hiveconf hive.metastore.warehouse.dir=/home/runner/work/spark/spark/target/tmp/spark-86d39e14-55f9-437b-a676-7a5cabd58904
   Exception: java.util.concurrent.TimeoutException: Futures timed out after [2 minutes]
   Failed to capture next expected output "Found an unclosed bracketed comment." within 2 minutes.
   
   2022-10-31 19:36:30.434 - stderr> Setting default log level to "WARN".
   2022-10-31 19:36:30.434 - stderr> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
   2022-10-31 19:36:41.979 - stderr> Spark master: local, Application Id: local-1667270193279
   2022-10-31 19:36:43.351 - stdout> spark-sql> /* SELECT /*+ HINT() 4; */;
   2022-10-31 19:36:44.119 - stderr> 
   2022-10-31 19:36:44.119 - stderr> [PARSE_SYNTAX_ERROR] Syntax error at or near ';'(line 1, pos 26)
   2022-10-31 19:36:44.119 - stderr> 
   2022-10-31 19:36:44.119 - stderr> == SQL ==
   2022-10-31 19:36:44.119 - stderr> /* SELECT /*+ HINT() 4; */;
   2022-10-31 19:36:44.119 - stderr> --------------------------^^^
   2022-10-31 19:36:44.119 - stderr> 
   2022-10-31 19:36:44.141 - stdout> spark-sql> /* SELECT /*+ HINT() 4; */ SELECT 1;
   2022-10-31 19:36:48.341 - stdout> 1
   2022-10-31 19:36:48.341 - stderr> Time taken: 4.196 seconds, Fetched 1 row(s)
   2022-10-31 19:36:48.352 - stdout> spark-sql> /* Here is a unclosed bracketed comment SELECT 1;
   2022-10-31 19:36:48.361 - stderr> 
   2022-10-31 19:36:48.361 - stderr> null(line 1, pos 0)
   ```
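   The `null(line 1, pos 0)` in the log above comes from rendering an absent message string. A minimal Scala sketch of that fallback (hypothetical names, not Spark's actual `ParseException`; only meant to show why the literal text "null" ends up in the output):
   
   ```scala
   // Hypothetical sketch (NOT Spark's real ParseException): when no message
   // string is supplied and only an error class is set, formatting the
   // Option[String] message via .orNull interpolates the literal "null".
   object NullMessageSketch extends App {
     case class SketchParseException(
         command: Option[String],
         message: Option[String],          // None when only an error class is set
         errorClass: Option[String]) {
       // Mimics the "<message>(line 1, pos 0)" rendering seen in the CLI log
       def render(line: Int, pos: Int): String =
         s"${message.orNull}(line $line, pos $pos)"
     }
   
     val e = SketchParseException(
       command = Some("/* Here is a unclosed bracketed comment SELECT 1;"),
       message = None,                     // message omitted -> null
       errorClass = Some("UNCLOSED_BRACKETED_COMMENT"))
   
     println(e.render(1, 0))               // prints "null(line 1, pos 0)"
   }
   ```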
   
   I think we might need to introduce a new `SparkException` counterpart for `ParseException`, such as a `SparkParseException`.
   
   So, can we keep it as is for now, and remove the duplicated message once `SparkParseException` is introduced?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

