Posted to dev@spark.apache.org by Fritz Budiyanto <fb...@icloud.com> on 2016/12/02 19:20:40 UTC

Flink event session window in Spark

Hi All,

I need help with implementing a Flink-style event session window in Spark. Is this possible?

For instance, I want to create a session window with a timeout of 10 minutes (see the Flink snippet below).
A continuous stream of events keeps the session window alive. If there is no activity for 10 minutes, the session window should close and forward its data to a sink function.

// event-time session windows
input                                                                           
    .keyBy(<key selector>)                                                      
    .window(EventTimeSessionWindows.withGap(Time.minutes(10)))                  
    .<windowed transformation>(<window function>);                              
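
For concreteness, a filled-in version of the template above might look roughly like the sketch below (untested); the Click type, the userId key, and the reduce step are made-up names, and timestamp/watermark assignment is omitted for brevity.

// Illustrative sketch only: Click, userId, and the reduce step are invented,
// and event-time timestamp/watermark assignment is omitted.
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.assigners.EventTimeSessionWindows
import org.apache.flink.streaming.api.windowing.time.Time

case class Click(userId: String, url: String, timestampMs: Long)

val env = StreamExecutionEnvironment.getExecutionEnvironment
val input: DataStream[Click] = env.fromElements(
  Click("u1", "/home", 1000L),
  Click("u1", "/cart", 2000L))

input
  .keyBy(_.userId)                                            // <key selector>
  .window(EventTimeSessionWindows.withGap(Time.minutes(10)))  // 10-minute session gap
  .reduce((a, b) => if (a.timestampMs > b.timestampMs) a else b) // <window function>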
                                                                                
Any ideas?

Thanks,
Fritz

Re: Flink event session window in Spark

Posted by Miguel Morales <th...@gmail.com>.
Although this may not be natively supported, you can mimic the behavior
by using a micro-batch interval of 1 minute. Then, in your
updateStateByKey function, check how long the session has been idle; if
it's longer than 10 minutes, return an empty state (None) so that the key
is removed from the stream.
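
Roughly something like this (an untested sketch; SessionState, the socket
source, and the checkpoint path are illustrative only):

// Untested sketch: approximates a 10-minute session gap with 1-minute micro-batches.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Minutes, StreamingContext}

case class SessionState(events: Seq[String], lastSeenMs: Long)

val sessionGapMs = Minutes(10).milliseconds

// Called every batch for each key that has new events or existing state.
def updateSession(newEvents: Seq[String],
                  state: Option[SessionState]): Option[SessionState] = {
  val now = System.currentTimeMillis()
  val prev = state.getOrElse(SessionState(Seq.empty, now))
  if (newEvents.nonEmpty) {
    // Activity in this batch: extend the session and refresh the timestamp.
    Some(SessionState(prev.events ++ newEvents, now))
  } else if (now - prev.lastSeenMs > sessionGapMs) {
    // Idle longer than the gap: this is where you'd forward the session to a
    // sink, then return None so the key is dropped from the state.
    None
  } else {
    Some(prev) // quiet batch, but still within the gap: keep the session open
  }
}

val conf = new SparkConf().setAppName("SessionWindowSketch")
val ssc = new StreamingContext(conf, Minutes(1))   // 1-minute micro-batches
ssc.checkpoint("/tmp/session-checkpoint")          // required by updateStateByKey

// Illustrative source; in practice this would be Kafka, Kinesis, etc.
val sessions = ssc.socketTextStream("localhost", 9999)
  .map(line => (line.split(",")(0), line))         // (sessionKey, rawEvent)
  .updateStateByKey(updateSession _)

sessions.print()
ssc.start()
ssc.awaitTermination()

Note this keys off processing time (when batches run), not event time, so
it only approximates Flink's event-time session windows.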


Re: Flink event session window in Spark

Posted by Michael Armbrust <mi...@databricks.com>.
Here is the JIRA for adding this feature:
https://issues.apache.org/jira/browse/SPARK-10816
