Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2020/10/28 06:42:00 UTC

[jira] [Resolved] (SPARK-32934) Improve the performance for NTH_VALUE and Refactor the OffsetWindowFunction

     [ https://issues.apache.org/jira/browse/SPARK-32934?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan resolved SPARK-32934.
---------------------------------
    Fix Version/s: 3.1.0
       Resolution: Fixed

Issue resolved by pull request 29800
[https://github.com/apache/spark/pull/29800]

> Improve the performance for NTH_VALUE and Refactor the OffsetWindowFunction
> --------------------------------------------------------------------------
>
>                 Key: SPARK-32934
>                 URL: https://issues.apache.org/jira/browse/SPARK-32934
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: jiaan.geng
>            Assignee: jiaan.geng
>            Priority: Major
>             Fix For: 3.1.0
>
>
> Spark SQL supports window functions such as NTH_VALUE.
> If we specify a window frame like
> {code:java}
> UNBOUNDED PRECEDING AND CURRENT ROW
> {code}
> or
> {code:java}
> UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING
> {code}
> we can eliminate some calculations.
> For example, if we execute the SQL shown below:
> {code:java}
> SELECT NTH_VALUE(col, 2) OVER (
>          ORDER BY rank
>          ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW)
> FROM tab;
> {code}
> For every row whose row number is greater than 1, the output is the same fixed value; otherwise the output is null. So we only need to compute the value once and then check whether each row's number is less than 2.
> The UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING case is even simpler: the frame is the entire partition, so every row gets the same value.
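The shortcut described in the issue can be sketched in a few lines of plain Python (a hypothetical illustration of the idea, not Spark's actual OffsetWindowFunction implementation): with a frame of UNBOUNDED PRECEDING AND CURRENT ROW, NTH_VALUE(col, n) depends only on the first n rows of the partition, so the nth value can be computed once instead of rescanning the frame for every row.

```python
# Hypothetical sketch of the NTH_VALUE shortcut; not Spark's actual code.
# Frame: ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW.

def nth_value_naive(partition, n):
    # Rescan the growing frame [0, i] for every row: O(rows) work per row.
    out = []
    for i in range(len(partition)):
        frame = partition[: i + 1]
        out.append(frame[n - 1] if len(frame) >= n else None)
    return out

def nth_value_optimized(partition, n):
    # Once the frame holds at least n rows, the nth value never changes,
    # so compute it a single time and reuse it for every later row.
    nth = partition[n - 1] if len(partition) >= n else None
    return [None if i + 1 < n else nth for i in range(len(partition))]

rows = [10, 20, 30, 40]
assert nth_value_naive(rows, 2) == nth_value_optimized(rows, 2) == [None, 20, 20, 20]
```

Both functions agree, but the optimized form does constant work per row after the one-time lookup, which mirrors the per-partition saving the pull request targets.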



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org