Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/10/15 19:04:05 UTC

[jira] [Resolved] (SPARK-10217) Spark SQL cannot handle ordering directive in ORDER BY clauses with expressions

     [ https://issues.apache.org/jira/browse/SPARK-10217?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-10217.
-------------------------------
    Resolution: Cannot Reproduce

> Spark SQL cannot handle ordering directive in ORDER BY clauses with expressions
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-10217
>                 URL: https://issues.apache.org/jira/browse/SPARK-10217
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.1
>         Environment: Ubuntu on AWS
>            Reporter: Simeon Simeonov
>              Labels: SQL, analyzers
>
> Spark SQL supports expressions in ORDER BY clauses, e.g.,
> {code}
> scala> sqlContext.sql("select cnt from test order by (cnt + cnt)")
> res2: org.apache.spark.sql.DataFrame = [cnt: bigint]
> {code}
> However, parsing fails with an AnalysisException when the expression is followed by an explicit ordering directive (ASC/DESC):
> {code}
> scala> sqlContext.sql("select cnt from test order by (cnt + cnt) asc")
> 15/08/25 04:08:02 INFO ParseDriver: Parsing command: select cnt from test order by (cnt + cnt) asc
> org.apache.spark.sql.AnalysisException: extraneous input 'asc' expecting EOF near '<EOF>'; line 1 pos 40
> 	at org.apache.spark.sql.hive.HiveQl$.createPlan(HiveQl.scala:289)
> 	at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:41)
> 	at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:40)
>         ...
> {code}
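
A possible workaround, offered here as a sketch rather than part of the original report: the same ordering can be expressed through the DataFrame API, which does not go through the HiveQl parser at all. This assumes a registered table "test" with a bigint column "cnt", as in the report above:

{code}
// Sketch: build the sort with the DataFrame API instead of SQL text,
// bypassing the HiveQl parser that rejects "(cnt + cnt) asc".
// Assumes a table "test" with a bigint column "cnt" is registered.
import sqlContext.implicits._

val df = sqlContext.table("test")
df.select($"cnt").orderBy(($"cnt" + $"cnt").asc).show()
{code}

Here Column.asc (or Column.desc) carries the ordering directive explicitly, so no ASC/DESC keyword ever reaches the SQL parser.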



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org