Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2022/12/31 05:06:00 UTC

[jira] [Resolved] (SPARK-41773) Window.partitionBy is not respected with row_number

     [ https://issues.apache.org/jira/browse/SPARK-41773?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-41773.
----------------------------------
    Fix Version/s: 3.4.0
       Resolution: Fixed

Issue resolved by pull request 39318
[https://github.com/apache/spark/pull/39318]

> Window.partitionBy is not respected with row_number 
> ----------------------------------------------------
>
>                 Key: SPARK-41773
>                 URL: https://issues.apache.org/jira/browse/SPARK-41773
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Connect
>    Affects Versions: 3.4.0
>            Reporter: Hyukjin Kwon
>            Assignee: Ruifeng Zheng
>            Priority: Major
>             Fix For: 3.4.0
>
>
> {code}
> File "/.../spark/python/pyspark/sql/connect/window.py", line 292, in pyspark.sql.connect.window.Window.orderBy
> Failed example:
>     df.withColumn("row_number", row_number().over(window)).show()
> Expected:
>     +---+--------+----------+
>     | id|category|row_number|
>     +---+--------+----------+
>     |  1|       a|         1|
>     |  1|       a|         2|
>     |  1|       b|         3|
>     |  2|       a|         1|
>     |  2|       b|         2|
>     |  3|       b|         1|
>     +---+--------+----------+
> Got:
>     +---+--------+----------+
>     | id|category|row_number|
>     +---+--------+----------+
>     |  1|       b|         1|
>     |  1|       a|         2|
>     |  1|       a|         3|
>     |  2|       b|         1|
>     |  2|       a|         2|
>     |  3|       b|         1|
>     +---+--------+----------+
> {code}
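>
> For reference, a minimal sketch that should reproduce the failing doctest; the rows and window spec are inferred from the expected output above, not copied from the doctest source:
> {code}
> from pyspark.sql import SparkSession, Window
> from pyspark.sql.functions import row_number
>
> spark = SparkSession.builder.getOrCreate()
>
> # Six rows matching the id/category pairs in the expected output.
> df = spark.createDataFrame(
>     [(1, "a"), (1, "a"), (1, "b"), (2, "a"), (2, "b"), (3, "b")],
>     ["id", "category"],
> )
>
> # row_number() should restart at 1 for each "id" partition and follow
> # the "category" ordering within each partition.
> window = Window.partitionBy("id").orderBy("category")
> df.withColumn("row_number", row_number().over(window)).show()
> {code}
> In the expected output, each id partition is numbered in category order; the "Got" output shows the window specification is not applied as expected on Connect.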



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
