Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:13:26 UTC

[jira] [Resolved] (SPARK-13911) Having condition and order by cannot both have aggregate functions

     [ https://issues.apache.org/jira/browse/SPARK-13911?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-13911.
----------------------------------
    Resolution: Incomplete

> Having condition and order by cannot both have aggregate functions
> ------------------------------------------------------------------
>
>                 Key: SPARK-13911
>                 URL: https://issues.apache.org/jira/browse/SPARK-13911
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.3.1, 1.4.1, 1.5.2, 1.6.1, 2.0.0, 2.2.0
>            Reporter: Cheng Lian
>            Priority: Major
>              Labels: bulk-closed
>
> Given the following temporary table:
> {code}
> sqlContext range 10 select ('id as 'a, 'id as 'b) registerTempTable "t"
> {code}
> The following SQL statement can't pass analysis:
> {noformat}
> scala> sqlContext sql "SELECT * FROM t GROUP BY a HAVING COUNT(b) > 0 ORDER BY COUNT(b)" show ()
> org.apache.spark.sql.AnalysisException: expression '`t`.`b`' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get.;
>   at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:36)
>   at org.apache.spark.sql.Dataset$.newDataFrame(Dataset.scala:58)
>   at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:784)
>   ... 49 elided
> {noformat}
> The reason is that the analysis rule {{ResolveAggregateFunctions}} only handles the first {{Filter}} _or_ {{Sort}} directly above an {{Aggregate}}, so a query whose HAVING clause and ORDER BY clause both reference aggregate functions cannot be fully resolved.
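> A possible workaround, sketched here for illustration rather than taken from the original report, is to perform the aggregation once in a subquery and let both the outer filter and the sort reference the aliased result (the subquery alias {{agg}} and column alias {{cnt}} are made up for this example):
> {code}
> // Hypothetical rewrite of the failing query: aggregate in a subquery and
> // alias COUNT(b), so that neither the filter nor the ORDER BY needs to
> // resolve the aggregate against the pre-aggregation relation.
> sqlContext.sql(
>   """SELECT a, cnt
>     |FROM (SELECT a, COUNT(b) AS cnt FROM t GROUP BY a) agg
>     |WHERE cnt > 0
>     |ORDER BY cnt""".stripMargin).show()
> {code}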



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
