Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2016/09/21 18:53:20 UTC

[jira] [Resolved] (SPARK-17616) Getting "java.lang.RuntimeException: Distinct columns cannot exist in Aggregate "

     [ https://issues.apache.org/jira/browse/SPARK-17616?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Josh Rosen resolved SPARK-17616.
--------------------------------
    Resolution: Duplicate

> Getting "java.lang.RuntimeException: Distinct columns cannot exist in Aggregate "
> ---------------------------------------------------------------------------------
>
>                 Key: SPARK-17616
>                 URL: https://issues.apache.org/jira/browse/SPARK-17616
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Egor Pahomov
>            Priority: Minor
>
> I execute:
> {code}
> select platform, 
>         collect_set(user_auth) as paid_types,
>         count(distinct sessionid) as sessions
>     from non_hss.session
>     where
>         event = 'stop' and platform != 'testplatform' and
>         not (month = MONTH(current_date()) AND year = YEAR(current_date()) and day = day(current_date())) and
>         (
>             (month >= MONTH(add_months(CURRENT_DATE(), -5)) AND year = YEAR(add_months(CURRENT_DATE(), -5)))
>             OR
>             (month <= MONTH(add_months(CURRENT_DATE(), -5)) AND year > YEAR(add_months(CURRENT_DATE(), -5)))
>         )
>     group by platform
> {code}
> I get:
> {code}
> java.lang.RuntimeException: Distinct columns cannot exist in Aggregate operator containing aggregate functions which don't support partial aggregation.
> {code}
> This worked in Spark 1.6.2. I've read the error five times and the code once, and I still don't understand what I'm doing incorrectly.
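
A possible workaround, not taken from the ticket itself: in Spark 2.0.0 the planner rejects a DISTINCT aggregate ({{count(distinct sessionid)}}) in the same Aggregate operator as an aggregate that does not support partial aggregation ({{collect_set}}). One hedged sketch is to compute the two aggregates in separate subqueries and join them on the grouping key (the {{...}} stands for the original WHERE clause, repeated in both subqueries):

{code}
-- Sketch of a workaround: split the aggregates into two Aggregate
-- operators so the DISTINCT count and collect_set never share one,
-- then join the per-platform results.
select a.platform, a.paid_types, b.sessions
from (
        select platform, collect_set(user_auth) as paid_types
        from non_hss.session
        where ...
        group by platform
     ) a
join (
        select platform, count(distinct sessionid) as sessions
        from non_hss.session
        where ...
        group by platform
     ) b
  on a.platform = b.platform
{code}

An equivalent alternative is to replace {{count(distinct sessionid)}} with a count over a pre-deduplicated subquery, which avoids the DISTINCT aggregate entirely.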



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org