Posted to issues@spark.apache.org by "Roy Cecil (JIRA)" <ji...@apache.org> on 2016/03/30 15:44:25 UTC
[jira] [Commented] (SPARK-14203) TPC-DS Query 86 - When there is a rollup, the parser keels over on grouping_id() on a single column
[ https://issues.apache.org/jira/browse/SPARK-14203?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15217979#comment-15217979 ]
Roy Cecil commented on SPARK-14203:
-----------------------------------
Closing this, as grouping(i_category) + grouping(i_class)
...
grouping(i_class) works.
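To spell out the workaround the comment alludes to: replace the multi-column grouping_id() with a sum of single-column grouping() calls, and the single-column grouping_id(i_class) with grouping(i_class). A sketch of the rewritten query (illustrative, assuming the standard TPC-DS q86 shape quoted below):

```sql
-- Hedged sketch: grouping_id() rewritten in terms of grouping().
-- With ROLLUP(i_category, i_class), grouping(i_category) + grouping(i_class)
-- yields hierarchy levels 0/1/2 where grouping_id() would encode 0/1/3;
-- the partitioning and ordering behavior is the same.
select
    sum(ws_net_paid) as total_sum
   ,i_category
   ,i_class
   ,grouping(i_category) + grouping(i_class) as lochierarchy
   ,rank() over (
        partition by grouping(i_category) + grouping(i_class),
            case when grouping(i_class) = 0 then i_category end
        order by sum(ws_net_paid) desc) as rank_within_parent
from web_sales, date_dim d1, item
where d1.d_month_seq between 1200 and 1200+11
  and d1.d_date_sk = ws_sold_date_sk
  and i_item_sk = ws_item_sk
group by rollup(i_category, i_class)
order by lochierarchy desc,
         case when lochierarchy = 0 then i_category end,
         rank_within_parent
limit 100;
```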
> TPC-DS Query 86 - When there is a rollup, the parser keels over on grouping_id() on a single column
> -----------------------------------------------------------------------------------------------------
>
> Key: SPARK-14203
> URL: https://issues.apache.org/jira/browse/SPARK-14203
> Project: Spark
> Issue Type: Bug
> Affects Versions: 1.6.1
> Environment: Red Hat Enterprise Linux Server release 7.1 (Maipo)
> Linux bigaperf171.svl.ibm.com 3.10.0-229.el7.x86_64 #1 SMP Thu Jan 29 18:37:38 EST 2015 x86_64 x86_64 x86_64 GNU/Linux
> Reporter: Roy Cecil
>
> Basically, the error below is from Spark 2.0.
> select
>     sum(ws_net_paid) as total_sum
>    ,i_category
>    ,i_class
>    ,grouping_id(i_category, i_class) as lochierarchy
>    ,rank() over (
>         partition by grouping_id(i_category, i_class),
>             case when grouping_id(i_class) = 0 then i_category end
>         order by sum(ws_net_paid) desc) as rank_within_parent
> from
>     web_sales
>    ,date_dim d1
>    ,item
> where
>     d1.d_month_seq between 1200 and 1200+11
>     and d1.d_date_sk = ws_sold_date_sk
>     and i_item_sk = ws_item_sk
> group by rollup(i_category, i_class)
> order by
>     lochierarchy desc,
>     case when lochierarchy = 0 then i_category end,
>     rank_within_parent
> limit 100
> 16/03/28 08:45:32 INFO parser.ParseDriver: Parse completed.
> Error in query: Columns of grouping_id (i_class#170) does not match grouping columns (i_category#172,i_class#170);
> There is no error in the query above, though Spark complains that there is one. In fact, it is perfectly fine to pick out one grouping column from a grouping set and refer to it within your query.
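To make that last point concrete: with GROUP BY ROLLUP, each output row carries an independent grouping flag per column, so reading a single column's flag is well-defined on its own. A small illustration of the assumed standard semantics (not from the ticket):

```sql
-- grouping(col) is 1 on rows where col was aggregated away, else 0.
-- For GROUP BY ROLLUP(i_category, i_class) the possible rows are:
--
--   i_category | i_class | grouping(i_category) | grouping(i_class)
--   -----------+---------+----------------------+------------------
--   Books      | art     |          0           |        0    -- detail row
--   Books      | NULL    |          0           |        1    -- category subtotal
--   NULL       | NULL    |          1           |        1    -- grand total
select i_category,
       i_class,
       grouping(i_category) as g_cat,
       grouping(i_class)    as g_cls,
       sum(ws_net_paid)     as total_sum
from web_sales
join item on i_item_sk = ws_item_sk
group by rollup(i_category, i_class);
```

Filtering on grouping(i_class) = 0 therefore selects exactly the detail rows, regardless of the other grouping columns.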
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)