Posted to issues@spark.apache.org by "Hernan Vivani (JIRA)" <ji...@apache.org> on 2016/09/08 15:49:21 UTC

[jira] [Created] (SPARK-17452) Spark 2.0.0 does not support the "partition" keyword in a "describe" statement when using Hive support

Hernan Vivani created SPARK-17452:
-------------------------------------

             Summary: Spark 2.0.0 does not support the "partition" keyword in a "describe" statement when using Hive support
                 Key: SPARK-17452
                 URL: https://issues.apache.org/jira/browse/SPARK-17452
             Project: Spark
          Issue Type: New Feature
          Components: Build
    Affects Versions: 2.0.0
         Environment: Amazon EMR 5.0.0
            Reporter: Hernan Vivani


The changes introduced in Spark 2 dropped support for the "partition" keyword in a "describe" statement.

EMR 5 (Spark 2.0):
==================
scala> import org.apache.spark.sql.SparkSession
scala> val sess=SparkSession.builder().appName("test").enableHiveSupport().getOrCreate()
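
(For a self-contained reproduction, the table can be created first. This is a sketch: the original report assumes page_view already exists in the Hive metastore, and the column list below is hypothetical.)
scala> sess.sql("create table page_view (viewtime bigint, userid bigint) partitioned by (dt string, country string)")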

scala> sess.sql("describe formatted page_view partition (dt='2008-06-08', country='AR')").show
org.apache.spark.sql.catalyst.parser.ParseException:
Unsupported SQL statement
== SQL ==
describe formatted page_view partition (dt='2008-06-08', country='AR')
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser$$anonfun$parsePlan$1.apply(ParseDriver.scala:58)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser$$anonfun$parsePlan$1.apply(ParseDriver.scala:53)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:82)
  at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:46)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:53)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:582)
  ... 48 elided
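
A possible workaround (a sketch, not verified on EMR 5.0.0): partition metadata can still be listed with SHOW PARTITIONS, and DESCRIBE FORMATTED without the partition clause still parses:

scala> sess.sql("show partitions page_view").show
scala> sess.sql("describe formatted page_view").show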


The same statement works fine on Spark 1.6.2 and Spark 1.5.2.
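
For reference, a minimal sketch of the passing path on Spark 1.6, where Hive support comes from HiveContext rather than SparkSession (sc is the shell's SparkContext):

scala> val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
scala> hiveContext.sql("describe formatted page_view partition (dt='2008-06-08', country='AR')").show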


