Posted to issues@spark.apache.org by "Michael Armbrust (JIRA)" <ji...@apache.org> on 2015/04/15 22:00:59 UTC

[jira] [Resolved] (SPARK-6730) Can't have table as identifier in OPTIONS

     [ https://issues.apache.org/jira/browse/SPARK-6730?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Armbrust resolved SPARK-6730.
-------------------------------------
       Resolution: Fixed
    Fix Version/s: 1.4.0

Issue resolved by pull request 5520
[https://github.com/apache/spark/pull/5520]
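
For context, a minimal parser-combinator sketch of the idea behind the fix: accept reserved words such as "table" as ordinary option keys inside OPTIONS. This is an illustration only, not the actual patch in pull request 5520; the object name OptionsParserSketch and the grammar rules are hypothetical.

{code}
import scala.util.parsing.combinator.JavaTokenParsers

// Illustrative sketch (not the actual Spark DDLParser): an OPTIONS clause
// parser whose option keys are plain identifiers, so words that are keywords
// elsewhere in the grammar (e.g. `table`, `keyspace`) still parse as keys.
object OptionsParserSketch extends JavaTokenParsers {
  // Any identifier is a valid option key, including reserved words.
  def optionKey: Parser[String] = ident

  // Option values are double-quoted strings, e.g. "test1"; strip the quotes.
  def optionValue: Parser[String] =
    stringLiteral ^^ (s => s.substring(1, s.length - 1))

  def option: Parser[(String, String)] =
    optionKey ~ optionValue ^^ { case k ~ v => (k, v) }

  def options: Parser[Map[String, String]] =
    "OPTIONS" ~> "(" ~> repsep(option, ",") <~ ")" ^^ (_.toMap)

  def main(args: Array[String]): Unit = {
    val input = """OPTIONS ( table "test1", keyspace "test" )"""
    // Prints a success with Map(table -> test1, keyspace -> test)
    println(parseAll(options, input))
  }
}
{code}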

> Can't have table as identifier in OPTIONS
> -----------------------------------------
>
>                 Key: SPARK-6730
>                 URL: https://issues.apache.org/jira/browse/SPARK-6730
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.3.0
>            Reporter: Alex Liu
>             Fix For: 1.4.0
>
>
> The following query fails because the identifier "table" (a reserved keyword) is used as an option key in OPTIONS:
> {code}
> CREATE TEMPORARY TABLE ddlTable
> USING org.apache.spark.sql.cassandra
> OPTIONS (
>  table "test1",
>  keyspace "test"
> )
> {code} 
> The statement fails with the following error:
> {code}
> [info]   java.lang.RuntimeException: [1.2] failure: ``insert'' expected but identifier CREATE found
> [info] 
> [info]  CREATE TEMPORARY TABLE ddlTable USING org.apache.spark.sql.cassandra OPTIONS (  table "test1",  keyspace "dstest"  )       
> [info]  ^
> [info]   at scala.sys.package$.error(package.scala:27)
> [info]   at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(AbstractSparkSQLParser.scala:40)
> [info]   at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:130)
> [info]   at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:130)
> [info]   at org.apache.spark.sql.SparkSQLParser$$anonfun$org$apache$spark$sql$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:96)
> [info]   at org.apache.spark.sql.SparkSQLParser$$anonfun$org$apache$spark$sql$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:95)
> [info]   at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
> [info]   at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
> [info]   at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
> [info]   at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
> [info]   at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
> [info]   at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
> [info]   at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
> [info]   at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
> [info]   at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
> [info]   at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
> [info]   at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
> [info]   at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
> [info]   at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
> [info]   at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
> [info]   at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
> [info]   at scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
> [info]   at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(AbstractSparkSQLParser.scala:38)
> [info]   at org.apache.spark.sql.SQLContext$$anonfun$parseSql$1.apply(SQLContext.scala:134)
> [info]   at org.apache.spark.sql.SQLContext$$anonfun$parseSql$1.apply(SQLContext.scala:134)
> [info]   at scala.Option.getOrElse(Option.scala:120)
> [info]   at org.apache.spark.sql.SQLContext.parseSql(SQLContext.scala:134)
> {code}
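>
> A possible workaround on 1.3.x, until the parser fix is available, is to bypass the DDL parser and pass the same options programmatically. The sketch below reuses the data source name and option keys from the statement above and assumes it runs in the spark-shell, where sc is a predefined SparkContext:
> {code}
> import org.apache.spark.sql.SQLContext
>
> val sqlContext = new SQLContext(sc)
>
> // Pass "table" and "keyspace" as plain map keys so the SQL parser never has
> // to accept the reserved word, then register the result for SQL queries.
> val df = sqlContext.load(
>   "org.apache.spark.sql.cassandra",
>   Map("table" -> "test1", "keyspace" -> "test"))
> df.registerTempTable("ddlTable")
> {code}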



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
