Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/06/13 03:07:01 UTC
[jira] [Resolved] (SPARK-28026) How to get the second row from 1 minute window
[ https://issues.apache.org/jira/browse/SPARK-28026?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-28026.
----------------------------------
Resolution: Invalid
> How to get the second row from 1 minute window
> ----------------------------------------------
>
> Key: SPARK-28026
> URL: https://issues.apache.org/jira/browse/SPARK-28026
> Project: Spark
> Issue Type: Question
> Components: Examples, Structured Streaming
> Affects Versions: 2.4.0
> Reporter: Devendra Vishwakarma
> Priority: Major
>
> I have been blocked for almost a month trying to figure out which API achieves one piece of functionality related to Spark Structured Streaming with window grouping, so I thought I would reach out to you here.
> What I have is stock-related time-series data, which I have grouped into 1-minute windows along with the stock name. I am able to get the first and last rows in that 1-minute group, but I also want some values from the second row of that 1-minute window, which I am not able to do at all. I looked at every function related to aggregation but could not find one that fits. Please help me.
> This is what I have done so far -
> val aggregates = stockEvents
>   .withWatermark("timestamp", "5 seconds")
>   .groupBy(window($"timestamp", "1 minute", "1 minute", "0 seconds"), $"stockName")
>   .agg(
>     first("tradingprice").alias("open"), // I have to make this value come from the second row
>     last("tradingprice").alias("close"),
>     max("tradingprice").alias("high"),
>     min("tradingprice").alias("low"))
>
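For readers landing on this archived thread: the ticket itself was closed as Invalid with no answer, but one possible approach is to collect each window's rows into a sorted array and index into it, since the built-in aggregates (`first`, `last`, `min`, `max`) cannot address an arbitrary row. This is a sketch, not an answer from the ticket; it assumes Spark 2.4+, where `element_at` is available (1-based, negative indices count from the end) and `sort_array` over an array of structs orders by the first struct field:

```scala
import org.apache.spark.sql.functions._

// Sketch only: collect (timestamp, price) pairs per 1-minute window, sort them
// by timestamp, then pick the second element's price as "open". Column names
// mirror the snippet in the question above.
val aggregates = stockEvents
  .withWatermark("timestamp", "5 seconds")
  .groupBy(window($"timestamp", "1 minute"), $"stockName")
  .agg(
    // collect_list gives no ordering guarantee, so sort explicitly;
    // sort_array on structs orders by the first field, i.e. timestamp
    sort_array(collect_list(struct($"timestamp", $"tradingprice"))).alias("rows"),
    max("tradingprice").alias("high"),
    min("tradingprice").alias("low"))
  // element_at is 1-based, so index 2 is the second row of the window;
  // it yields null if the window holds fewer than two rows
  .withColumn("open", element_at($"rows", 2).getField("tradingprice"))
  // negative index -1 addresses the last element
  .withColumn("close", element_at($"rows", -1).getField("tradingprice"))
  .drop("rows")
```

Caveats: collecting every row of a window into memory can be costly for busy windows, and whether this aggregation is accepted in a given streaming output mode should be verified against the Structured Streaming programming guide for the Spark version in use.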
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org