Posted to issues@spark.apache.org by "Herman van Hovell (JIRA)" <ji...@apache.org> on 2016/01/18 20:07:39 UTC

[jira] [Commented] (SPARK-12880) Different results on groupBy after window function

    [ https://issues.apache.org/jira/browse/SPARK-12880?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15105678#comment-15105678 ] 

Herman van Hovell commented on SPARK-12880:
-------------------------------------------

You are ordering by {{YYYYMM}}, and it looks like that ordering is not unique (for instance at 200509). Sorting is not guaranteed to produce the same order for tied values across runs, so calling the {{lag()}} function will not return the same results each time.

Please check whether the {{YYYYMM}} values are unique for each {{(product, bnd, age)}} tuple.
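The suggested check can be sketched in plain Scala collections (the sample {{rows}} data is hypothetical, invented for illustration). In Spark itself the equivalent would be {{df_data.groupBy("product", "bnd", "age", "yyyymm").count().filter("count > 1").show()}} — any surviving row means the window's {{orderBy}} key is not unique within its partition.

```scala
// Hypothetical sample rows: (product, bnd, age, yyyymm).
// The third row duplicates yyyymm=200509 within the same
// (product, bnd, age) partition, which is the problematic case.
val rows = Seq(
  ("MAIN", "High", 30, 200508),
  ("MAIN", "High", 30, 200509),
  ("MAIN", "High", 30, 200509),
  ("MAIN", "Low",  45, 200509)
)

// Count occurrences of each full key; any count > 1 means the
// orderBy key has ties inside a partition, so lag() may pick a
// different "previous row" on each run.
val duplicates = rows
  .groupBy(identity)
  .collect { case (key, group) if group.size > 1 => (key, group.size) }

duplicates.foreach { case (key, n) => println(s"$key appears $n times") }

// If duplicates exist, one way out is to make the ordering total by
// adding a tie-breaking column (here `row_id` is a hypothetical
// unique identifier):
//   Window.partitionBy("product", "bnd", "age")
//         .orderBy(asc("yyyymm"), asc("row_id"))
```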

> Different results on groupBy after window function
> --------------------------------------------------
>
>                 Key: SPARK-12880
>                 URL: https://issues.apache.org/jira/browse/SPARK-12880
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.6.0
>            Reporter: Saif Addin Ellafi
>            Priority: Critical
>
> {noformat}
> scala> val overVint = Window.partitionBy("product", "bnd", "age").orderBy(asc("yyyymm"))
> scala> val df_data2 = df_data.withColumn("result", lag("baleom", 1).over(overVint))
> scala> df_data2.filter("product = 'MAIN' and bnd = 'High' and yyyymm = 200509").groupBy("yyyymm", "closed", "ever_closed").agg(sum("result").as("result")).show
> +------+------+-----------+--------------------+
> |yyyymm|closed|ever_closed|              result|
> +------+------+-----------+--------------------+
> |200509|     1|          1|1.2672666129980398E7|
> |200509|     0|          0|2.7104834668856387E9|
> |200509|     0|          1| 1.151339011298214E8|
> +------+------+-----------+--------------------+
> scala> df_data2.filter("product = 'MAIN' and bnd = 'High' and yyyymm = 200509").groupBy("yyyymm", "closed", "ever_closed").agg(sum("result").as("result")).show
> +------+------+-----------+--------------------+
> |yyyymm|closed|ever_closed|              result|
> +------+------+-----------+--------------------+
> |200509|     1|          1|1.2357681589980595E7|
> |200509|     0|          0| 2.709930867575646E9|
> |200509|     0|          1|1.1595048973981345E8|
> +------+------+-----------+--------------------+
> {noformat}
> This does NOT happen with columns that are not produced by the window function.
> It happens both in cluster mode and in local mode.
> Before the groupBy operation, the data looks good and is consistent.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
