Posted to issues@spark.apache.org by "Michael Armbrust (JIRA)" <ji...@apache.org> on 2015/06/30 21:26:06 UTC

[jira] [Resolved] (SPARK-8628) Race condition in AbstractSparkSQLParser.parse

     [ https://issues.apache.org/jira/browse/SPARK-8628?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Armbrust resolved SPARK-8628.
-------------------------------------
       Resolution: Fixed
    Fix Version/s: 1.5.0
                   1.4.2

Issue resolved by pull request 7015
[https://github.com/apache/spark/pull/7015]

> Race condition in AbstractSparkSQLParser.parse
> ----------------------------------------------
>
>                 Key: SPARK-8628
>                 URL: https://issues.apache.org/jira/browse/SPARK-8628
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.3.0, 1.3.1, 1.4.0
>            Reporter: Santiago M. Mola
>            Priority: Critical
>              Labels: regression
>             Fix For: 1.4.2, 1.5.0
>
>
> SPARK-5009 introduced the following code in AbstractSparkSQLParser:
> {code}
>   def parse(input: String): LogicalPlan = {
>     // Initialize the Keywords.
>     // Note: `lexical` is state shared by every parse() call.
>     lexical.initialize(reservedWords)
>     phrase(start)(new lexical.Scanner(input)) match {
>       case Success(plan, _) => plan
>       case failureOrError => sys.error(failureOrError.toString)
>     }
>   }
> {code}
> The corresponding initialize method in SqlLexical is not thread-safe:
> {code}
>   /* This is a workaround to support the lazy setting */
>   def initialize(keywords: Seq[String]): Unit = {
>     reserved.clear()
>     reserved ++= keywords
>   }
> {code}
> I'm hitting this when parsing multiple SQL queries concurrently. When parsing of one query begins, it empties the shared reserved-keyword list; until the list is repopulated, queries being tokenized on other threads fail to parse because the lexer treats keywords as plain identifiers.
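>
> For illustration, a hypothetical reproduction sketch (sqlParser stands in for any AbstractSparkSQLParser instance; it is a placeholder, not from the report): firing many parses concurrently makes some of them fail intermittently.
> {code}
> import scala.concurrent.{Await, Future}
> import scala.concurrent.duration._
> import scala.concurrent.ExecutionContext.Implicits.global
>
> // sqlParser is a placeholder for an AbstractSparkSQLParser instance.
> // Each Future runs parse(), which clears and refills the shared
> // reserved-keyword table while other threads are still tokenizing.
> val queries = Seq.fill(100)("SELECT * FROM t WHERE x > 1")
> val results = Future.traverse(queries)(q => Future(sqlParser.parse(q)))
> Await.result(results, 1.minute) // fails intermittently before the fix
> {code}
> One way to close the race is to make keyword initialization a one-time, thread-safe step. A minimal sketch, assuming the structure quoted above (not necessarily the exact change made in pull request 7015):
> {code}
> // A lazy val is initialized under a monitor, so the keyword table is
> // populated exactly once and never cleared while another thread parses.
> protected lazy val initLexical: Unit = lexical.initialize(reservedWords)
>
> def parse(input: String): LogicalPlan = {
>   initLexical // forces the one-time initialization on first use
>   phrase(start)(new lexical.Scanner(input)) match {
>     case Success(plan, _) => plan
>     case failureOrError => sys.error(failureOrError.toString)
>   }
> }
> {code}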



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
