Posted to issues@spark.apache.org by "Dovi Joel (Jira)" <ji...@apache.org> on 2022/10/20 13:11:00 UTC

[jira] [Commented] (SPARK-40176) Enhance collapse window optimization to work in case partition or order by keys are expressions

    [ https://issues.apache.org/jira/browse/SPARK-40176?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17621100#comment-17621100 ] 

Dovi Joel commented on SPARK-40176:
-----------------------------------

Is this related to this issue? https://issues.apache.org/jira/browse/SPARK-30552 

> Enhance collapse window optimization to work in case partition or order by keys are expressions
> -----------------------------------------------------------------------------------------------
>
>                 Key: SPARK-40176
>                 URL: https://issues.apache.org/jira/browse/SPARK-40176
>             Project: Spark
>          Issue Type: Task
>          Components: SQL
>    Affects Versions: 3.2.0, 3.2.1, 3.3.0
>            Reporter: Ayushi Agarwal
>            Priority: Major
>
> In a window operator with multiple window functions, if any expression (rather than a plain column) appears in the partition-by or order-by clause, the windows are not collapsed, even when the partition and order expressions are identical across all of the window functions.
> E.g. query:
> val w = Window.partitionBy("key").orderBy(lower(col("value")))
> df.select(lead("key", 1).over(w), lead("value", 1).over(w))
> Current Plan:
> Window(lead(value, 1), key, _w1)                    ---- W1
> +- Sort(key, _w1)
>    +- Project(lower("value") as _w1)                ---- P1
>       +- Window(lead(key, 1), key, _w0)             ---- W2
>          +- Sort(key, _w0)
>             +- Exchange(key)
>                +- Project(lower("value") as _w0)    ---- P2
>                   +- Scan
>  
> W1 and W2 can be merged into a single Window operator.
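
Until the optimization is enhanced, a possible workaround is to materialize the order-by expression in a projection first, so that both window specs reference the same plain column; adjacent Window operators over identical simple attributes are then eligible for collapsing. A sketch in the Scala DataFrame API (the sample data and the `value_lc` column name are illustrative, not from the issue):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.{col, lead, lower}

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

// Illustrative data with the "key"/"value" columns used in the issue.
val df = Seq(("a", "X"), ("a", "Y"), ("b", "Z")).toDF("key", "value")

// Materialize lower(value) once, then order by the plain column.
// Both lead() calls now share an identical window spec over simple
// attributes, so only one Project/_wN alias is introduced and the
// two Window operators can be collapsed into one.
val withSortKey = df.withColumn("value_lc", lower(col("value")))
val w = Window.partitionBy("key").orderBy("value_lc")

val result = withSortKey.select(
  lead("key", 1).over(w),
  lead("value", 1).over(w))

result.explain() // plan should show a single Window operator
```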



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
