Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2020/10/13 11:23:00 UTC

[jira] [Commented] (SPARK-33131) Fix grouping sets with having clause can not resolve qualified col name

    [ https://issues.apache.org/jira/browse/SPARK-33131?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17213068#comment-17213068 ] 

Apache Spark commented on SPARK-33131:
--------------------------------------

User 'ulysses-you' has created a pull request for this issue:
https://github.com/apache/spark/pull/30029

> Fix grouping sets with having clause can not resolve qualified col name
> -----------------------------------------------------------------------
>
>                 Key: SPARK-33131
>                 URL: https://issues.apache.org/jira/browse/SPARK-33131
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: ulysses you
>            Priority: Minor
>
> The method `ResolveAggregateFunctions.resolveFilterCondInAggregate` aims to do two things:
> 1. resolve the expressions in the HAVING clause;
> 2. push the extra aggregate expressions from HAVING down into `Aggregate`.
> However, only 2 is handled today: if the HAVING clause is resolved successfully but there is no extra aggregate expression, the resolution is discarded (a simplified sketch of this behavior follows the example). Here is an example:
> {code:java}
> -- Works: resolved by ResolveReferences
> select c1 from values (1) as t1(c1) group by grouping sets(t1.c1) having c1 = 1
> -- Works because of the extra expression c1
> select c1 as c2 from values (1) as t1(c1) group by grouping sets(t1.c1) having t1.c1 = 1
> -- Fails
> select c1 from values (1) as t1(c1) group by grouping sets(t1.c1) having t1.c1 = 1{code}
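
For readers unfamiliar with the analyzer internals, below is a minimal, hypothetical sketch of the behavior described above. It is not the actual Spark code: the `Resolution` type, its fields, and both methods are invented purely for illustration. It only shows why a successfully resolved HAVING condition can still be lost when there is no extra aggregate expression to push into `Aggregate`, and one plausible direction for a fix.

{code:scala}
// Hypothetical, simplified model of the behavior described in the issue.
// None of these types exist in Spark; they only illustrate the control flow.
object ResolveHavingSketch {

  // Stand-in for the result of resolving the HAVING condition against the
  // child Aggregate: the resolved condition plus any aggregate expressions
  // that appear in HAVING but not in the Aggregate's output.
  final case class Resolution(resolvedCond: Option[String], extraAggExprs: Seq[String])

  // Sketch of the current (buggy) logic: the resolved condition is only used
  // when extra aggregate expressions must be pushed down; otherwise the
  // successful resolution is thrown away and the qualified column name
  // (e.g. `t1.c1`) stays unresolved.
  def currentBehavior(r: Resolution): Option[String] =
    if (r.extraAggExprs.nonEmpty) r.resolvedCond else None

  // Sketch of one plausible fix: keep the resolved condition whether or not
  // anything extra has to be pushed into Aggregate.
  def fixedBehavior(r: Resolution): Option[String] =
    r.resolvedCond

  def main(args: Array[String]): Unit = {
    // Mirrors the failing query: HAVING t1.c1 = 1 resolves fine, but there
    // is no extra aggregate expression to push down.
    val noExtra = Resolution(Some("(c1 = 1)"), Nil)
    println(currentBehavior(noExtra)) // None            -> resolution lost, query fails
    println(fixedBehavior(noExtra))   // Some((c1 = 1))  -> resolution kept
  }
}
{code}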



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org