Posted to issues@spark.apache.org by "Reynold Xin (JIRA)" <ji...@apache.org> on 2016/03/31 18:48:25 UTC
[jira] [Resolved] (SPARK-12772) Better error message for syntax error in the SQL parser
[ https://issues.apache.org/jira/browse/SPARK-12772?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Reynold Xin resolved SPARK-12772.
---------------------------------
Resolution: Fixed
Fix Version/s: 2.0.0
Closing this one, as the error message is much better with the new ANTLR 4 parser.
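For reference, re-running the snippet from the description against the new parser now raises a ParseException that points at the offending position and echoes the query. The output below is an approximation of that format (the exact message and expected-token list may differ slightly between builds):
{code}
scala> sql("select case if(true, 'one', 'two')").explain(true)
org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input '<EOF>' expecting 'WHEN' (line 1, pos 34)

== SQL ==
select case if(true, 'one', 'two')
----------------------------------^^^
{code}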
> Better error message for syntax error in the SQL parser
> -------------------------------------------------------
>
> Key: SPARK-12772
> URL: https://issues.apache.org/jira/browse/SPARK-12772
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 2.0.0
> Reporter: Reynold Xin
> Fix For: 2.0.0
>
>
> {code}
> scala> sql("select case if(true, 'one', 'two')").explain(true)
> org.apache.spark.sql.AnalysisException: org.antlr.runtime.EarlyExitException
> line 1:34 required (...)+ loop did not match anything at input '<EOF>' in case expression
> ; line 1 pos 34
> at org.apache.spark.sql.catalyst.parser.ParseErrorReporter.throwError(ParseDriver.scala:140)
> at org.apache.spark.sql.catalyst.parser.ParseErrorReporter.throwError(ParseDriver.scala:129)
> at org.apache.spark.sql.catalyst.parser.ParseDriver$.parse(ParseDriver.scala:77)
> at org.apache.spark.sql.catalyst.CatalystQl.createPlan(CatalystQl.scala:53)
> at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:41)
> at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:40)
> {code}
> Is there a way to say something better than "required (...)+ loop did not match anything at input"?
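For context, the friendlier messages come from hooks that ANTLR 4 exposes: a generated parser can drop the default console listener and install one that rethrows the diagnostic with the position attached. A minimal sketch against the stock ANTLR 4 runtime API (the class name and message format here are illustrative, not Spark's actual parser code):
{code}
import org.antlr.v4.runtime.{BaseErrorListener, RecognitionException, Recognizer}

// Illustrative listener: rethrow ANTLR's default diagnostic as an exception
// that carries the line and character position of the offending token.
class PositionAwareErrorListener extends BaseErrorListener {
  override def syntaxError(
      recognizer: Recognizer[_, _],
      offendingSymbol: scala.Any,
      line: Int,
      charPositionInLine: Int,
      msg: String,
      e: RecognitionException): Unit = {
    throw new IllegalArgumentException(
      s"Syntax error at line $line, position $charPositionInLine: $msg")
  }
}

// Attach it to a generated parser in place of the default listener:
//   parser.removeErrorListeners()
//   parser.addErrorListener(new PositionAwareErrorListener)
{code}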
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org