Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 03:59:29 UTC

[jira] [Updated] (SPARK-20828) Concatenated grouping sets scenario not supported

     [ https://issues.apache.org/jira/browse/SPARK-20828?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-20828:
---------------------------------
    Labels: bulk-closed  (was: )

> Concatenated grouping sets scenario not supported 
> --------------------------------------------------
>
>                 Key: SPARK-20828
>                 URL: https://issues.apache.org/jira/browse/SPARK-20828
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.1.0
>            Reporter: N Campbell
>            Priority: Major
>              Labels: bulk-closed
>
> The following scenario, which is supported by other vendors (e.g. Oracle, DB2), is not supported by Spark SQL:
>  WITH 
> SQL1 AS 
>     (
>     SELECT
>         sno AS C1, 
>         pno AS C2, 
>         SUM(qty) AS C3
>     FROM
>         cert.tsupply 
>     GROUP BY 
>         ROLLUP(sno), 
>         CUBE(pno)
>     )
> SELECT
>     SQL1.C1 AS C1, 
>     SQL1.C2 AS C2, 
>     SQL1.C3 AS C3
> FROM
>     SQL1
> Error: [Simba][SparkJDBCDriver](500051) ERROR processing query/statement. Error Code: ERROR_STATE, SQL state: org.apache.spark.sql.AnalysisException: expression 'tsupply.`sno`' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get.;;
> 'Project ['SQL1.C1 AS C1#1517671, 'SQL1.C2 AS C2#1517672, 'SQL1.C3 AS C3#1517673]
> +- 'SubqueryAlias SQL1
>    +- 'Aggregate [rollup(sno#1517678), cube(pno#1517679)], [sno#1517678 AS C1#1517674, pno#1517679 AS C2#1517675, sum(cast(qty#1517681 as bigint)) AS C3#1517676L]
>       +- MetastoreRelation cert, tsupply
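
A note on the semantics involved: concatenating ROLLUP(sno) with CUBE(pno) is defined as the cross product of the two grouping-set lists, i.e. the four groupings (sno, pno), (sno), (pno) and (). As a possible workaround (a sketch only, using the same cert.tsupply table and columns as above), that cross product can be spelled out with an explicit GROUPING SETS clause, which Spark SQL's grouping-sets support should accept:

    -- Equivalent query with the ROLLUP/CUBE concatenation expanded by hand
    -- into its cross product of grouping sets.
    WITH
    SQL1 AS
        (
        SELECT
            sno AS C1,
            pno AS C2,
            SUM(qty) AS C3
        FROM
            cert.tsupply
        GROUP BY
            sno, pno
        GROUPING SETS ((sno, pno), (sno), (pno), ())
        )
    SELECT
        SQL1.C1 AS C1,
        SQL1.C2 AS C2,
        SQL1.C3 AS C3
    FROM
        SQL1

Because every grouping is named explicitly, the analyzer no longer has to expand the ROLLUP/CUBE concatenation itself, which is the step that fails in the reported plan.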



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org