Posted to issues@spark.apache.org by "Ankit Raj Boudh (Jira)" <ji...@apache.org> on 2019/12/10 04:42:00 UTC

[jira] [Comment Edited] (SPARK-30130) Hardcoded numeric values in common table expressions which utilize GROUP BY are interpreted as ordinal positions

    [ https://issues.apache.org/jira/browse/SPARK-30130?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16992185#comment-16992185 ] 

Ankit Raj Boudh edited comment on SPARK-30130 at 12/10/19 4:41 AM:
-------------------------------------------------------------------

Hi Matt Boegner, could you please help me reproduce this issue?


> Hardcoded numeric values in common table expressions which utilize GROUP BY are interpreted as ordinal positions
> ----------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-30130
>                 URL: https://issues.apache.org/jira/browse/SPARK-30130
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.4
>            Reporter: Matt Boegner
>            Priority: Minor
>
> Hardcoded numeric values in common table expressions which utilize GROUP BY are interpreted as ordinal positions. 
> {code:java}
> // count(*) assumed: the snippet as filed had a bare "count", which does
> // not parse; the error's "valid range is [1, 2]" implies two select items.
> val df = spark.sql("""
>   with a as (select 0 as test, count(*) group by test)
>   select * from a
> """)
> df.show()
> {code}
>  This results in an error message like {color:#e01e5a}GROUP BY position 0 is not in select list (valid range is [1, 2]){color}, i.e. the literal 0 bound to the alias test is being treated as a select-list position rather than as a value.
>  
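> A possible workaround sketch (an assumption, not verified against 2.4.4): Spark resolves bare integer literals in GROUP BY as select-list ordinals when {{spark.sql.groupByOrdinal}} is true (the default), so the alias test is first substituted with the literal 0 and that literal is then read as a position. Disabling the setting should suppress the ordinal interpretation:
> {code:java}
> // Hedged sketch: spark.sql.groupByOrdinal (default true) controls whether
> // integer literals in GROUP BY are resolved as select-list positions.
> // Whether this bypasses the reported failure is an assumption.
> spark.conf.set("spark.sql.groupByOrdinal", "false")
> val df = spark.sql("""
>   with a as (select 0 as test, count(*) group by test)
>   select * from a
> """)
> df.show()
> {code}
> 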
> However, this error does not appear in a traditional subselect format. For example, this query executes correctly:
> {code:java}
> // Same count(*) assumption as above; per the report, the subselect form
> // runs without the ordinal error.
> val df = spark.sql("""
>   select * from (select 0 as test, count(*) group by test) a
> """)
> df.show()
> {code}
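> 
> Another possible query-level workaround, under the assumption that the analyzer only treats bare integer literals as ordinals: wrap the constant so the expression substituted for the alias is no longer a bare literal, e.g.
> {code:java}
> // Hedged sketch: cast(0 as int) is a Cast expression rather than a bare
> // integer literal, so the assumption is that GROUP BY ordinal resolution
> // no longer matches it after alias substitution. Unverified against 2.4.4.
> val df = spark.sql("""
>   with a as (select cast(0 as int) as test, count(*) group by test)
>   select * from a
> """)
> df.show()
> {code}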
>   



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org