Posted to users@kafka.apache.org by Praveen <pr...@gmail.com> on 2017/09/01 04:52:20 UTC

Writing streams to kafka topic

Hi,

I have a use case where I want to schedule processing of events in the
future. I am not really sure if this is a proper use of a stream processing
application, but I was looking at KTable and the Kafka Streams API to see if
this was possible.

So far the pattern I have is:
    FEED -> changelog stream -> groupByKey() -> window -> write to a
different Kafka topic

The window here, I believe, would be the TumblingWindow for my use case. I'd
like to write back to a Kafka topic only after the window retention ends.
The documentation
<http://docs.confluent.io/current/streams/developer-guide.html#writing-streams-back-to-kafka>
says that streams may only be written "continuously" to the Kafka topic. Is
that the case?

- Praveen
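[Editor's note: for concreteness, the tumbling-window grouping in the pattern above can be sketched in plain Java, with no Kafka dependency. The window size, key layout, and `countByWindow` method are illustrative assumptions, not Kafka Streams API; a tumbling window simply buckets each event into a fixed-size, non-overlapping time slot.]

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of tumbling-window grouping: count events per (key, window) pair.
// Windows are aligned to multiples of windowSizeMs and do not overlap.
public class TumblingWindowSketch {

    // Each event is a {key, timestampMs} pair; the result maps
    // "key@windowStart" to the number of events in that window.
    public static Map<String, Long> countByWindow(List<String[]> events, long windowSizeMs) {
        Map<String, Long> counts = new HashMap<>();
        for (String[] e : events) {
            long ts = Long.parseLong(e[1]);
            long windowStart = ts - (ts % windowSizeMs); // tumbling: aligned, non-overlapping
            String windowedKey = e[0] + "@" + windowStart;
            counts.merge(windowedKey, 1L, Long::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String[]> events = new ArrayList<>();
        events.add(new String[]{"a", "100"});
        events.add(new String[]{"a", "900"});
        events.add(new String[]{"a", "1100"});
        // With a 1000 ms window: two events land in window 0, one in window 1000.
        System.out.println(countByWindow(events, 1000L));
    }
}
```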

Re: Writing streams to kafka topic

Posted by Praveen <pr...@gmail.com>.
Thanks. I was able to quickly build a simple example out of this. Also saw
the issue with punctuate and your “tick” feed recommendation for now.

- Praveen

On Fri, Sep 1, 2017 at 9:48 AM, Matthias J. Sax <ma...@confluent.io>
wrote:

> Hi,
>
> this is not supported by the DSL layer. What you would need to do is
> add a custom stateful transform() operator after the window
> (`stream.groupByKey().aggregate().toStream().transform().to()`) that
> buffers the output and remembers the latest result. Second, you would
> schedule a punctuation that emits the data whenever you want.
>
> Hope this helps.
>
>
> -Matthias
>
> On 8/31/17 9:52 PM, Praveen wrote:
> > Hi,
> >
> > I have a use case where I want to schedule processing of events in the
> > future. I am not really sure if this is a proper use of a stream processing
> > application, but I was looking at KTable and the Kafka Streams API to see if
> > this was possible.
> >
> > So far the pattern I have is:
> >     FEED -> changelog stream -> groupByKey() -> window -> write to a
> > different Kafka topic
> >
> > The window here, I believe, would be the TumblingWindow for my use case. I'd
> > like to write back to a Kafka topic only after the window retention ends.
> > The documentation
> > <http://docs.confluent.io/current/streams/developer-guide.html#writing-streams-back-to-kafka>
> > says that streams may only be written "continuously" to the Kafka topic. Is
> > that the case?
> >
> > - Praveen
> >
>
>

Re: Writing streams to kafka topic

Posted by "Matthias J. Sax" <ma...@confluent.io>.
Hi,

this is not supported by the DSL layer. What you would need to do is
add a custom stateful transform() operator after the window
(`stream.groupByKey().aggregate().toStream().transform().to()`) that
buffers the output and remembers the latest result. Second, you would
schedule a punctuation that emits the data whenever you want.

Hope this helps.


-Matthias
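[Editor's note: the buffer-and-punctuate idea above can be sketched in plain Java, with no Kafka Streams dependency. The class and method names here (`process`, `punctuate`) are illustrative stand-ins for the real Transformer callbacks, not the actual API; the point is that intermediate updates per key overwrite each other, so downstream sees only the final value when the scheduled flush fires.]

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of a stateful buffer-and-emit operator: keep only the latest
// aggregation result per key, then flush everything on a scheduled tick.
public class BufferAndEmitSketch {
    // Latest value seen per key, in first-seen key order.
    private final Map<String, String> latest = new LinkedHashMap<>();

    // Called once per windowed aggregation update: overwrite the old value.
    public void process(String key, String value) {
        latest.put(key, value);
    }

    // Called by the scheduled punctuation (e.g. after the window closes):
    // emit one final record per key, then clear the buffer for the next round.
    public List<String> punctuate() {
        List<String> emitted = new ArrayList<>();
        for (Map.Entry<String, String> e : latest.entrySet()) {
            emitted.add(e.getKey() + "=" + e.getValue()); // stands in for .to(topic)
        }
        latest.clear();
        return emitted;
    }
}
```

With this shape, three updates to key "a" produce a single downstream record for "a" once the punctuation fires, which is the "write only after the window ends" behavior asked about above.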

On 8/31/17 9:52 PM, Praveen wrote:
> Hi,
> 
> I have a use case where I want to schedule processing of events in the
> future. I am not really sure if this is a proper use of a stream processing
> application, but I was looking at KTable and the Kafka Streams API to see if
> this was possible.
> 
> So far the pattern I have is:
>     FEED -> changelog stream -> groupByKey() -> window -> write to a
> different Kafka topic
> 
> The window here, I believe, would be the TumblingWindow for my use case. I'd
> like to write back to a Kafka topic only after the window retention ends.
> The documentation
> <http://docs.confluent.io/current/streams/developer-guide.html#writing-streams-back-to-kafka>
> says that streams may only be written "continuously" to the Kafka topic. Is
> that the case?
> 
> - Praveen
>