Posted to issues@spark.apache.org by "Ruifeng Zheng (Jira)" <ji...@apache.org> on 2022/12/30 08:35:00 UTC

[jira] [Assigned] (SPARK-41758) Support Window.rowsBetween

     [ https://issues.apache.org/jira/browse/SPARK-41758?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ruifeng Zheng reassigned SPARK-41758:
-------------------------------------

    Assignee: Ruifeng Zheng

> Support Window.rowsBetween
> --------------------------
>
>                 Key: SPARK-41758
>                 URL: https://issues.apache.org/jira/browse/SPARK-41758
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Connect
>    Affects Versions: 3.4.0
>            Reporter: Sandeep Singh
>            Assignee: Ruifeng Zheng
>            Priority: Major
>
> The doctest for pyspark.sql.connect.window.Window.rowsBetween fails with the error below:
> {code:python}
> File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/column.py", line 324, in pyspark.sql.connect.column.Column.over
> Failed example:
>     window = Window.partitionBy("name").orderBy("age") \
>         .rowsBetween(Window.unboundedPreceding, Window.currentRow)
> Exception raised:
>     Traceback (most recent call last):
>       File "/usr/local/Cellar/python@3.10/3.10.8/Frameworks/Python.framework/Versions/3.10/lib/python3.10/doctest.py", line 1350, in __run
>         exec(compile(example.source, filename, "single",
>       File "<doctest pyspark.sql.connect.column.Column.over[1]>", line 1, in <module>
>         window = Window.partitionBy("name").orderBy("age") \
>             .rowsBetween(Window.unboundedPreceding, Window.currentRow)
>       File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/utils.py", line 346, in wrapped
>         raise NotImplementedError()
>     NotImplementedError{code}
> We should re-enable this doctest once the issue is fixed in Spark Connect.
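> For reference, a minimal sketch of the behavior the doctest expects once rowsBetween is supported over Connect; the DataFrame contents here are illustrative assumptions, not taken from the doctest itself:
> {code:python}
> from pyspark.sql import SparkSession, Window
> from pyspark.sql import functions as F
>
> spark = SparkSession.builder.getOrCreate()
>
> # Hypothetical sample data with the "name" and "age" columns
> # the doctest refers to.
> df = spark.createDataFrame(
>     [("Alice", 2), ("Alice", 5), ("Bob", 8)], ["name", "age"]
> )
>
> # Frame covering every row from the start of the partition up to
> # and including the current row, i.e. a running (cumulative) sum.
> window = Window.partitionBy("name").orderBy("age") \
>     .rowsBetween(Window.unboundedPreceding, Window.currentRow)
>
> df.withColumn("cum_age", F.sum("age").over(window)).show()
> {code}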



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org