Posted to issues@spark.apache.org by "Herman van Hovell (JIRA)" <ji...@apache.org> on 2016/04/20 21:15:25 UTC
[jira] [Commented] (SPARK-14763) Can't analyze TPCDS Q70
[ https://issues.apache.org/jira/browse/SPARK-14763?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15250530#comment-15250530 ]
Herman van Hovell commented on SPARK-14763:
-------------------------------------------
[~davies] are you fixing this one?
> Can't analyze TPCDS Q70
> -----------------------
>
> Key: SPARK-14763
> URL: https://issues.apache.org/jira/browse/SPARK-14763
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Reporter: Davies Liu
> Assignee: Davies Liu
>
> {code}
> select
>   sum(ss_net_profit) as total_sum, s_state, s_county
>   ,grouping(s_state)+grouping(s_county) as lochierarchy
>   ,rank() over (
>     partition by grouping(s_state)+grouping(s_county),
>     case when grouping(s_county) = 0 then s_state end
>     order by sum(ss_net_profit) desc) as rank_within_parent
> from
>   store_sales, date_dim d1, store
> where
>   d1.d_month_seq between 1200 and 1200+11
>   and d1.d_date_sk = ss_sold_date_sk
>   and s_store_sk = ss_store_sk
>   and s_state in
>     (select s_state from
>       (select s_state as s_state,
>          rank() over (partition by s_state order by sum(ss_net_profit) desc) as ranking
>        from store_sales, store, date_dim
>        where d_month_seq between 1200 and 1200+11
>          and d_date_sk = ss_sold_date_sk
>          and s_store_sk = ss_store_sk
>        group by s_state) tmp1
>      where ranking <= 5)
> group by rollup(s_state,s_county)
> order by
>   lochierarchy desc
>   ,case when lochierarchy = 0 then s_state end
>   ,rank_within_parent
> limit 100
> {code}
> Error
> {code}
> ERR u'Accessing outer query column is not allowed in an AGGREATE: d_month_seq#158;'
> {code}
> The column d_month_seq inside the subquery does not actually reference the outer table; it is being resolved incorrectly as an outer query column.
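> For illustration, a minimal sketch of the suspected resolution issue, using hypothetical tables t1 and t2 that both expose a column named c (playing the role of d_month_seq, which exists in both the outer date_dim d1 and the subquery's own date_dim): if the analyzer binds c inside the aggregated subquery to the outer scope instead of the subquery's tables, the same check would be expected to fire.
> {code}
> -- hypothetical schema: t1(k, c) and t2(k, c)
> select sum(k)
> from t1
> where c between 1 and 10
>   and k in (select k
>             from t2
>             where c between 1 and 10  -- should bind to t2.c, not the outer t1.c
>             group by k)
> {code}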
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)