Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2016/08/14 04:15:20 UTC

[jira] [Commented] (SPARK-17049) LAG function fails when selecting all columns

    [ https://issues.apache.org/jira/browse/SPARK-17049?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15420207#comment-15420207 ] 

Dongjoon Hyun commented on SPARK-17049:
---------------------------------------

Hi, [~gcivan].
Yes, this fails on Spark 2.0. Fortunately, the bug appears to be fixed already on master:
{code}
scala> sql("create table a as select 1 as col")
scala> sql("select *, lag(col) over (order by col) as prev from a")
scala> sql("select *, lag(col) over (order by col) as prev from a").show()
+---+----+
|col|prev|
+---+----+
|  1|null|
+---+----+

scala> spark.version
res3: String = 2.1.0-SNAPSHOT
{code}
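
For reference, the same query can be written with the DataFrame API, which resolves the window frame through the same analyzer rule. A minimal sketch against the table created above (only the SQL form is verified here; the DataFrame form is assumed equivalent):
{code}
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.lag

// lag with no explicit frame; on a fixed build the analyzer attaches
// the required ROWS BETWEEN 1 PRECEDING AND 1 PRECEDING frame itself
val w = Window.orderBy("col")
spark.table("a").withColumn("prev", lag("col", 1).over(w)).show()
{code}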

> LAG function fails when selecting all columns
> ---------------------------------------------
>
>                 Key: SPARK-17049
>                 URL: https://issues.apache.org/jira/browse/SPARK-17049
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Gokhan Civan
>
> In version 1.6.1, the queries
> create table a as select 1 as col;
> select *, lag(col) over (order by col) as prev from a;
> successfully produce the table
> col  prev
> 1    null
> However, in version 2.0.0, this fails with the error
> org.apache.spark.sql.AnalysisException: Window Frame RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW must match the required frame ROWS BETWEEN 1 PRECEDING AND 1 PRECEDING;
>         at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.failAnalysis(CheckAnalysis.scala:40)
>         at org.apache.spark.sql.catalyst.analysis.Analyzer.failAnalysis(Analyzer.scala:58)
>         at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveWindowFrame$$anonfun$apply$29$$anonfun$applyOrElse$10.applyOrElse(Analyzer.scala:1785)
>         at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveWindowFrame$$anonfun$apply$29$$anonfun$applyOrElse$10.applyOrElse(Analyzer.scala:1781)
>         at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:279)
>         at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:279)
>         at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:69)
>         at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:278)
>         at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformDown$1.apply(TreeNode.scala:284)
>         at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformDown$1.apply(TreeNode.scala:284)
>         at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$5.apply(TreeNode.scala:321)
>         at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:179)
>         at org.apache.spark.sql.catalyst.trees.TreeNode.transformChildren(TreeNode.scala:319)
>         at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:284)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressionDown$1(QueryPlan.scala:156)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan.org$apache$spark$sql$catalyst$plans$QueryPlan$$recursiveTransform$1(QueryPlan.scala:166)
>         at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$org$apache$spark$sql$catalyst$plans$QueryPlan$$recursiveTransform$1$1.apply(QueryPlan.scala:170)
> ...
> On the other hand, the query works if * is replaced with col, as in
> select col, lag(col) over (order by col) as prev from a;
> It also works if the window frame is specified explicitly:
> select col, lag(col) over (order by col ROWS BETWEEN 1 PRECEDING AND 1 PRECEDING) as prev from a;
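
The second workaround pins the frame to exactly what the error message demands (ROWS BETWEEN 1 PRECEDING AND 1 PRECEDING), so the analyzer's frame-equality check passes. A hedged DataFrame-API sketch of the same idea, assuming the analyzer accepts an explicit frame that matches the one lag requires:
{code}
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.{col, lag}

// Explicit frame identical to the one the error message demands
val w = Window.orderBy("col").rowsBetween(-1, -1)
spark.table("a").select(col("col"), lag("col", 1).over(w).as("prev")).show()
{code}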



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org