Posted to issues@spark.apache.org by "Raviteja Lokineni (JIRA)" <ji...@apache.org> on 2016/10/13 10:38:20 UTC

[jira] [Commented] (SPARK-17893) Window functions should also allow looking back in time

    [ https://issues.apache.org/jira/browse/SPARK-17893?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15571556#comment-15571556 ] 

Raviteja Lokineni commented on SPARK-17893:
-------------------------------------------

[~srowen] I can't find anything in the documentation that offsets by timestamp or date.
* http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.expressions.Window
** Offsets by number of rows ahead or behind
** In my case I do not have a fixed number of rows between dates
* http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.functions$
** I looked at all four definitions of lead/lag in the docs, but they offset only by a number of rows, which is not possible in my case

FYI, the function that I was referring to was: {noformat}def window(timeColumn: Column, windowDuration: String): Column{noformat}
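For illustration, the backward-looking semantics being asked for here (for each row, aggregate over all rows whose date falls within the preceding 7 days, inclusive) can be sketched in plain Scala, outside of Spark. The {{Record}} case class and {{lookbackMinMax}} helper below are hypothetical names used only for this sketch, not part of any Spark API:

```scala
import java.time.LocalDate

// Each record carries a date and a value.
case class Record(date: LocalDate, value: Double)

// For each record, take min and max over all records whose date lies
// within the preceding `days` days, inclusive of the record's own date.
// This mirrors the lookback window requested in the issue.
def lookbackMinMax(records: Seq[Record], days: Long): Seq[(LocalDate, Double, Double)] =
  records.map { r =>
    val inWindow = records.filter { o =>
      !o.date.isAfter(r.date) && !o.date.isBefore(r.date.minusDays(days - 1))
    }
    (r.date, inWindow.map(_.value).min, inWindow.map(_.value).max)
  }

val data = Seq(
  Record(LocalDate.parse("2013-01-01"), 5.0),
  Record(LocalDate.parse("2013-01-04"), 2.0),
  Record(LocalDate.parse("2013-01-07"), 9.0)
)
// For 2013-01-07 the 7-day window spans 2013-01-01 through 2013-01-07,
// so it covers all three records: min = 2.0, max = 9.0.
val result = lookbackMinMax(data, 7)
```

The point of the sketch is only to pin down the windowing semantics; doing this efficiently over a DataFrame is exactly what the requested feature would provide.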

> Window functions should also allow looking back in time
> -------------------------------------------------------
>
>                 Key: SPARK-17893
>                 URL: https://issues.apache.org/jira/browse/SPARK-17893
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>    Affects Versions: 2.0.1
>            Reporter: Raviteja Lokineni
>
> This function should allow looking back. The current window(timestamp, duration) seems to be for looking forward in time.
> Example:
> {code}dataFrame.groupBy(window("date", "7 days ago")).agg(min("col1"), max("col1")){code}
> For example, if date is 2013-01-07 then the window should be 2013-01-01 through 2013-01-07
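As a quick check of the example in the description, the inclusive start of a backward-looking N-day window can be computed with {{java.time}}; {{windowStart}} is a hypothetical helper for this sketch, not a Spark API:

```scala
import java.time.LocalDate

// Inclusive start of a backward-looking window of `days` days ending at
// `end`: a 7-day window ending 2013-01-07 starts at 2013-01-01.
def windowStart(end: LocalDate, days: Long): LocalDate =
  end.minusDays(days - 1)

val start = windowStart(LocalDate.parse("2013-01-07"), 7)
// start == 2013-01-01, matching the window given in the description
```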



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org