Posted to issues@spark.apache.org by "Xiao Li (JIRA)" <ji...@apache.org> on 2016/12/14 17:53:58 UTC

[jira] [Commented] (SPARK-18863) Output non-aggregate expressions without GROUP BY in a subquery does not yield an error

    [ https://issues.apache.org/jira/browse/SPARK-18863?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15748975#comment-15748975 ] 

Xiao Li commented on SPARK-18863:
---------------------------------

The JIRA description shows only one scenario. The expected error message could be either
`org.apache.spark.sql.AnalysisException: grouping expressions sequence is empty, and 'c1.`cv`' is not an aggregate function.`
or
`org.apache.spark.sql.AnalysisException: expression 'c1.`cv`' is neither present in the group by, nor is it an aggregate function.`
depending on whether a GROUP BY clause is present at all.
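For reference, the same analyzer checks fire on much simpler, uncorrelated queries; a minimal sketch against a table {{c(ck, cv)}} like the one in the description below (exact message wording may vary across Spark versions):

{code:sql}
-- No GROUP BY at all: the analyzer rejects this with the
-- "grouping expressions sequence is empty" message above.
SELECT cv, avg(cv)
FROM   c;

-- GROUP BY on a different column: rejected with the
-- "neither present in the group by" message above.
SELECT cv, avg(cv)
FROM   c
GROUP BY ck;
{code}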

> Output non-aggregate expressions without GROUP BY in a subquery does not yield an error 
> ----------------------------------------------------------------------------------------
>
>                 Key: SPARK-18863
>                 URL: https://issues.apache.org/jira/browse/SPARK-18863
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Nattavut Sutyanyong
>            Priority: Minor
>
> [~smilegator] has found that the following query does not raise an analysis error (note that the GROUP BY clause is commented out):
> {code:sql}
> SELECT pk, cv
> FROM   p, c
> WHERE  p.pk = c.ck
> AND    c.cv = (SELECT max(avg)
>                FROM   (SELECT   c1.cv, avg(c1.cv) avg
>                        FROM     c c1
>                        WHERE    c1.ck = p.pk
> --                       GROUP BY c1.cv
>                       ))
> {code}
> Without the GROUP BY, there could be multiple values of {{c1.cv}} for the single {{avg(c1.cv)}} computed over all matching rows, so the analyzer should reject the query.
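
For comparison, restoring the commented-out GROUP BY makes the inner query well-defined, since each group then yields exactly one {{c1.cv}} per {{avg(c1.cv)}}:

{code:sql}
SELECT pk, cv
FROM   p, c
WHERE  p.pk = c.ck
AND    c.cv = (SELECT max(avg)
               FROM   (SELECT   c1.cv, avg(c1.cv) avg
                       FROM     c c1
                       WHERE    c1.ck = p.pk
                       GROUP BY c1.cv))
{code}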


