Posted to issues@spark.apache.org by "liuxian (JIRA)" <ji...@apache.org> on 2017/07/31 04:09:00 UTC
[jira] [Updated] (SPARK-21580) There's a bug with `Group by ordinal`
[ https://issues.apache.org/jira/browse/SPARK-21580?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
liuxian updated SPARK-21580:
----------------------------
Description:
create temporary view data as select * from values
(1, 1),
(1, 2),
(2, 1),
(2, 2),
(3, 1),
(3, 2)
as data(a, b);
*select 3, 4, sum(b) from data group by 1, 2;*
Running this query raises the following exception:
Error in query: GROUP BY position 4 is not in select list (valid range is [1, 3]); line 1 pos 10
But this query runs normally: *select 1, 2, sum(b) from data group by 1, 2;*
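One plausible reading of the error is that the analyzer first substitutes the GROUP BY ordinals 1 and 2 with the corresponding select-list items (the integer literals 3 and 4), and then a later pass mistakenly re-interprets the substituted literal 4 as an ordinal, which is out of range for a three-item select list. The following is a minimal Python sketch of that double-resolution behavior; it is a hypothetical illustration, not Spark's actual analyzer code:

```python
# Hypothetical sketch (not Spark source) of how resolving GROUP BY
# ordinals twice can reproduce the reported error message.

def resolve_group_by(select_list, group_by_exprs):
    """Replace 1-based integer ordinals in GROUP BY with select-list items."""
    resolved = []
    for expr in group_by_exprs:
        if isinstance(expr, int):  # treated as an ordinal
            if not 1 <= expr <= len(select_list):
                raise ValueError(
                    f"GROUP BY position {expr} is not in select list "
                    f"(valid range is [1, {len(select_list)}])")
            resolved.append(select_list[expr - 1])
        else:
            resolved.append(expr)
    return resolved

select_list = [3, 4, "sum(b)"]  # select 3, 4, sum(b)
once = resolve_group_by(select_list, [1, 2])  # substitutes the literals 3 and 4

# A buggy second pass treats the substituted literal 4 as an ordinal again,
# which is out of range for a 3-item select list:
try:
    resolve_group_by(select_list, once)
except ValueError as e:
    print(e)
```

With `select 1, 2, sum(b)` the substituted literals happen to be valid ordinals, so a second resolution pass silently succeeds, which would explain why that variant runs normally.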
> There's a bug with `Group by ordinal`
> --------------------------------------
>
> Key: SPARK-21580
> URL: https://issues.apache.org/jira/browse/SPARK-21580
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.0
> Reporter: liuxian
>
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org