Posted to issues@spark.apache.org by "Javier Fuentes (Jira)" <ji...@apache.org> on 2019/12/18 12:42:01 UTC

[jira] [Comment Edited] (SPARK-30049) SQL fails to parse when comment contains an unmatched quote character

    [ https://issues.apache.org/jira/browse/SPARK-30049?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16999116#comment-16999116 ] 

Javier Fuentes edited comment on SPARK-30049 at 12/18/19 12:41 PM:
-------------------------------------------------------------------

That issue seems to be fixed in SPARK-30295 [~yumwang] 

!Screen Shot 2019-12-18 at 9.26.29 AM.png!


was (Author: javier_ivanov):
That issue seems to be fixed in SPARK-30295 [~yumwang] 

> SQL fails to parse when comment contains an unmatched quote character
> ---------------------------------------------------------------------
>
>                 Key: SPARK-30049
>                 URL: https://issues.apache.org/jira/browse/SPARK-30049
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Jason Darrell Lowe
>            Priority: Major
>         Attachments: Screen Shot 2019-12-18 at 9.26.29 AM.png
>
>
> A SQL statement whose comment contains an unmatched quote character fails to parse. Such queries parsed correctly in older versions of Spark. For example, here's an excerpt from an interactive spark-sql session on a recent Spark-3.0.0-SNAPSHOT build (commit e23c135e568d4401a5659bc1b5ae8fc8bf147693):
> {noformat}
> spark-sql> SELECT 1 -- someone's comment here
>          > ;
> Error in query: 
> extraneous input ';' expecting <EOF>(line 2, pos 0)
> == SQL ==
> SELECT 1 -- someone's comment here
> ;
> ^^^
> {noformat}
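
The failure mode above is what you would expect if a comment scanner lets an apostrophe inside a `--` comment toggle string-literal state, so the parser never sees the end of the statement. The sketch below is NOT Spark's implementation (Spark's SQL lexer is ANTLR-based); it is a minimal, hypothetical Python illustration of the correct precedence: once `--` is seen outside a string literal, everything to end-of-line is comment, and quotes inside it are inert.

```python
def strip_line_comments(sql: str) -> str:
    """Remove '--' line comments while tracking single-quote string state.

    Illustrative sketch only (not Spark's lexer). The key invariant:
    an apostrophe inside a comment -- e.g. in "someone's" -- must not
    open a string literal, which is the bug the report describes.
    """
    out = []
    in_string = False
    i = 0
    while i < len(sql):
        ch = sql[i]
        if in_string:
            out.append(ch)
            if ch == "'":
                # '' is an escaped quote inside a SQL string literal
                if i + 1 < len(sql) and sql[i + 1] == "'":
                    out.append("'")
                    i += 1
                else:
                    in_string = False
        elif ch == "'":
            in_string = True
            out.append(ch)
        elif ch == "-" and sql[i:i + 2] == "--":
            # Comment runs to end of line; drop it, keep the newline
            nl = sql.find("\n", i)
            if nl == -1:
                break
            i = nl
            continue
        else:
            out.append(ch)
        i += 1
    return "".join(out)
```

With this precedence, the failing statement from the report reduces to `SELECT 1` plus the trailing `;`, and a `--` sequence inside a string literal is left untouched.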



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
