Posted to issues@spark.apache.org by "XiDuo You (Jira)" <ji...@apache.org> on 2022/02/11 07:01:00 UTC

[jira] [Updated] (SPARK-38185) Fix incorrect results when aggregate is group-only with no aggregate functions

     [ https://issues.apache.org/jira/browse/SPARK-38185?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

XiDuo You updated SPARK-38185:
------------------------------
    Description: 
The group-only condition should check whether the list of aggregate expressions is empty.

In the DataFrame API, it is allowed to create an empty aggregation.

So the following query should return 1 rather than 0, because it is a global aggregate.
{code:scala}
val emptyAgg = Map.empty[String, String]
spark.range(2).where("id > 2").agg(emptyAgg).limit(1).count
{code}
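
For illustration only, here is a minimal sketch of the kind of check the first sentence refers to. The names (isGroupOnly, groupingExprs, aggregateExprs) are hypothetical, not the actual Spark internals; the point is that an aggregate with no aggregate expressions and no grouping keys is a global aggregate, so it must not be treated as "group only" and must still emit one row over empty input.
{code:scala}
// Illustrative sketch only -- hypothetical names, not the actual Spark code.
// A vacuous forall over an empty list would make an empty aggregate look
// "group only", so the check also requires at least one aggregate expression.
def isGroupOnly(groupingExprs: Seq[String], aggregateExprs: Seq[String]): Boolean =
  aggregateExprs.nonEmpty && aggregateExprs.forall(e => groupingExprs.contains(e))

// isGroupOnly(Nil, Nil) == false -> agg(Map.empty) stays a global aggregate,
// which returns exactly one row even when its input is empty, so the count is 1.
{code}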


  was:
The group-only condition should check whether the list of aggregate expressions is empty.

In the DataFrame API, it is allowed to create an empty aggregation.

So the following query should return 1 rather than 0, because it is a global aggregate.
{code:scala}
spark.range(2).where("id > 2").agg(emptyAgg).limit(1).count
{code}



> Fix incorrect results when aggregate is group-only with no aggregate functions
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-38185
>                 URL: https://issues.apache.org/jira/browse/SPARK-38185
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.2.1, 3.3.0
>            Reporter: XiDuo You
>            Priority: Major
>
> The group-only condition should check whether the list of aggregate expressions is empty.
> In the DataFrame API, it is allowed to create an empty aggregation.
> So the following query should return 1 rather than 0, because it is a global aggregate.
> {code:scala}
> val emptyAgg = Map.empty[String, String]
> spark.range(2).where("id > 2").agg(emptyAgg).limit(1).count
> {code}



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org