Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/09/21 22:24:20 UTC
[jira] [Commented] (SPARK-17616) Getting "java.lang.RuntimeException: Distinct columns cannot exist in Aggregate"
[ https://issues.apache.org/jira/browse/SPARK-17616?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15511386#comment-15511386 ]
Apache Spark commented on SPARK-17616:
--------------------------------------
User 'hvanhovell' has created a pull request for this issue:
https://github.com/apache/spark/pull/15187
> Getting "java.lang.RuntimeException: Distinct columns cannot exist in Aggregate "
> ---------------------------------------------------------------------------------
>
> Key: SPARK-17616
> URL: https://issues.apache.org/jira/browse/SPARK-17616
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.0
> Reporter: Egor Pahomov
> Priority: Minor
>
> I execute:
> {code}
> select platform,
>        collect_set(user_auth) as paid_types,
>        count(distinct sessionid) as sessions
> from non_hss.session
> where event = 'stop'
>   and platform != 'testplatform'
>   and not (month = MONTH(current_date()) and year = YEAR(current_date()) and day = DAY(current_date()))
>   and (
>     (month >= MONTH(add_months(CURRENT_DATE(), -5)) and year = YEAR(add_months(CURRENT_DATE(), -5)))
>     or
>     (month <= MONTH(add_months(CURRENT_DATE(), -5)) and year > YEAR(add_months(CURRENT_DATE(), -5)))
>   )
> group by platform
> {code}
> I get:
> {code}
> java.lang.RuntimeException: Distinct columns cannot exist in Aggregate operator containing aggregate functions which don't support partial aggregation.
> {code}
> This worked in 1.6.2. I've read the error five times and the code once, and I still don't understand what I'm doing incorrectly.
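A possible workaround (a sketch, not part of the original report): the error arises because {{collect_set}} does not support partial aggregation, and the planner cannot combine it with a distinct aggregate like {{count(distinct ...)}} in the same Aggregate operator. Assuming the goal is only the number of distinct sessions, the distinct count can itself be expressed through {{collect_set}}, which removes the conflicting distinct column:
{code}
select platform,
       collect_set(user_auth) as paid_types,
       -- size() of a collect_set gives the distinct count without a
       -- count(distinct ...) aggregate in the same operator
       size(collect_set(sessionid)) as sessions
from non_hss.session
where event = 'stop' and platform != 'testplatform'
group by platform
{code}
Note this sketch drops the original date filters for brevity; they would apply unchanged. It also assumes the set of distinct session ids per group fits in memory, since {{collect_set}} materializes the whole set.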
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org