Posted to users@kafka.apache.org by Alessandro Tagliapietra <ta...@gmail.com> on 2019/09/16 19:11:00 UTC

Values not being aggregated

Hello everyone,

I have this code
https://gist.github.com/alex88/719383f38541c5324caf8f47b7239e15 (I've
omitted the store setup part) and I have a problem with a specific key.
Basically, I can see the logs up until "Pair Stream data", but the
"Aggregate stream data" line is never logged.

Since between these lines there is just the groupByKey and windowedBy, is
there any logic in these two that could stop the flow of data? Since I
don't have any window-closing mechanism or suppression, shouldn't the data
just go through?

Thank you in advance

--
Alessandro Tagliapietra

Re: Values not being aggregated

Posted by Alessandro Tagliapietra <ta...@gmail.com>.
Thanks a lot Bruno I'll check that!

--
Alessandro Tagliapietra

On Wed, Sep 18, 2019 at 4:20 PM Bruno Cadonna <br...@confluent.io> wrote:

> Hi Alessandro,
>
> If you want to get each update to an aggregate, you need to disable
> the cache. Otherwise, an update will only be emitted when the
> aggregate is evicted or flushed from the cache.
>
> To disable the cache, you can:
> - disable it per store with the `Materialized` object
> - set cache.max.bytes.buffering to zero, which disables all caches in
> the topology
>
> Best,
> Bruno
>
> On Mon, Sep 16, 2019 at 12:11 PM Alessandro Tagliapietra
> <ta...@gmail.com> wrote:
> >
> > Hello everyone,
> >
> > I have this code
> > https://gist.github.com/alex88/719383f38541c5324caf8f47b7239e15 (I've
> > omitted the store setup part) and I have a problem with a specific key.
> > Basically, I can see the logs up until "Pair Stream data", but the
> > "Aggregate stream data" line is never logged.
> >
> > Since between these lines there is just the groupByKey and windowedBy, is
> > there any logic in these two that could stop the flow of data? Since I
> > don't have any window-closing mechanism or suppression, shouldn't the
> > data just go through?
> >
> > Thank you in advance
> >
> > --
> > Alessandro Tagliapietra
>

Re: Values not being aggregated

Posted by Bruno Cadonna <br...@confluent.io>.
Hi Alessandro,

If you want to get each update to an aggregate, you need to disable
the cache. Otherwise, an update will only be emitted when the
aggregate is evicted or flushed from the cache.

To disable the cache, you can:
- disable it per store with the `Materialized` object
- set cache.max.bytes.buffering to zero, which disables all caches in
the topology
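Concretely, the two options above might look like the following. This is a
minimal sketch: the `cache.max.bytes.buffering` key is written as a plain
string so the snippet compiles without the kafka-streams jar (with it on the
classpath you would normally use `StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG`),
and the per-store variant is only sketched in a comment with hypothetical
store and type names.

```java
import java.util.Properties;

public class DisableStreamsCache {

    // Option 2: disable every record cache in the topology by setting
    // cache.max.bytes.buffering to zero, so each update to an aggregate
    // is forwarded downstream immediately instead of waiting for a cache
    // eviction or flush.
    public static Properties streamsConfig() {
        Properties props = new Properties();
        props.put("cache.max.bytes.buffering", "0");
        return props;
    }

    // Option 1, per store, would look roughly like this (hypothetical
    // store name and types, not compiled here):
    //
    //   stream.groupByKey()
    //         .windowedBy(TimeWindows.of(Duration.ofMinutes(1)))
    //         .aggregate(initializer, aggregator,
    //             Materialized.<String, MyAgg, WindowStore<Bytes, byte[]>>as("my-store")
    //                 .withCachingDisabled());

    public static void main(String[] args) {
        System.out.println(streamsConfig().getProperty("cache.max.bytes.buffering"));
    }
}
```

With either option in place, the downstream "Aggregate stream data" logging
should fire on every update rather than only on cache eviction or flush.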

Best,
Bruno

On Mon, Sep 16, 2019 at 12:11 PM Alessandro Tagliapietra
<ta...@gmail.com> wrote:
>
> Hello everyone,
>
> I have this code
> https://gist.github.com/alex88/719383f38541c5324caf8f47b7239e15 (I've
> omitted the store setup part) and I have a problem with a specific key.
> Basically, I can see the logs up until "Pair Stream data", but the
> "Aggregate stream data" line is never logged.
>
> Since between these lines there is just the groupByKey and windowedBy, is
> there any logic in these two that could stop the flow of data? Since I
> don't have any window-closing mechanism or suppression, shouldn't the
> data just go through?
>
> Thank you in advance
>
> --
> Alessandro Tagliapietra