Posted to issues@spark.apache.org by "Yin Huai (JIRA)" <ji...@apache.org> on 2016/07/26 04:00:23 UTC

[jira] [Resolved] (SPARK-16642) ResolveWindowFrame should not be triggered on UnresolvedFunctions.

     [ https://issues.apache.org/jira/browse/SPARK-16642?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yin Huai resolved SPARK-16642.
------------------------------
       Resolution: Fixed
    Fix Version/s: 2.1.0
                   2.0.1

Issue resolved by pull request 14284
[https://github.com/apache/spark/pull/14284]

> ResolveWindowFrame should not be triggered on UnresolvedFunctions.
> ------------------------------------------------------------------
>
>                 Key: SPARK-16642
>                 URL: https://issues.apache.org/jira/browse/SPARK-16642
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Yin Huai
>            Assignee: Yin Huai
>             Fix For: 2.0.1, 2.1.0
>
>
> The case at https://github.com/apache/spark/blob/75146be6ba5e9f559f5f15430310bb476ee0812c/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala#L1790-L1792 is shown below
> {code}
> case we @ WindowExpression(e, s @ WindowSpecDefinition(_, o, UnspecifiedFrame)) =>
>   val frame = SpecifiedWindowFrame.defaultWindowFrame(o.nonEmpty, acceptWindowFrame = true)
>   we.copy(windowSpec = s.copy(frameSpecification = frame))
> {code}
> This case will be triggered even when the function is unresolved. So, when functions like {{lead}} are used, we may see errors like {{Window Frame RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW must match the required frame ROWS BETWEEN 1 FOLLOWING AND 1 FOLLOWING.}} because we wrongly set the frame specification.
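> A minimal reproduction sketch, assuming a SparkSession named {{spark}} on an affected build (the temp view name {{t}} is illustrative):
> {code}
> // Hypothetical reproduction: going through the SQL parser, lead() arrives as an
> // UnresolvedFunction, so ResolveWindowFrame fires before the function is resolved
> // and fills in the default RANGE frame instead of the ROWS frame lead() requires.
> spark.range(10).createOrReplaceTempView("t")
> spark.sql("SELECT lead(id) OVER (ORDER BY id) FROM t").show()
> // Expected failure: Window Frame RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
> // must match the required frame ROWS BETWEEN 1 FOLLOWING AND 1 FOLLOWING.
> {code}
> One way to keep the rule from firing too early is to require the window function to be resolved before assigning the default frame. A sketch of that guard, not necessarily the exact change made in pull request 14284:
> {code}
> case we @ WindowExpression(e, s @ WindowSpecDefinition(_, o, UnspecifiedFrame))
>     if e.resolved =>
>   // Only pick a default frame once the function is resolved, so functions
>   // that mandate their own frames (for example Lead/Lag) are not clobbered.
>   val frame = SpecifiedWindowFrame.defaultWindowFrame(o.nonEmpty, acceptWindowFrame = true)
>   we.copy(windowSpec = s.copy(frameSpecification = frame))
> {code}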


