Posted to user@flink.apache.org by Flavio Pompermaier <po...@okkam.it> on 2020/07/16 07:36:44 UTC

Accumulators in Table API

Hi to all,
in my legacy code (using the DataSet API) I used to add a map function
just after the source read to keep a count of the rows. This gave me a
very light and unobtrusive way of counting the rows of a dataset. Can I do
something similar in the Table API? Is there a way to use accumulators?

Thanks in advance,
Flavio

Re: Accumulators in Table API

Posted by Dawid Wysakowicz <dw...@apache.org>.
Hi Flavio,

You don't have access to accumulators in the Table API.

A few other ways that come to my mind are:

1. Use the existing metrics, e.g. the operators' input/output record counts.

2. Register custom metrics in a UDF.

3. Run a regular COUNT query (multiple queries can be optimized into a
single job graph via TableEnvironment#createStatementSet).
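Options 2 and 3 could be sketched roughly as follows (a non-authoritative sketch assuming a Flink 1.11-era setup, as current at the time of this thread; the class, metric, and table names are made up for illustration):

```java
import org.apache.flink.metrics.Counter;
import org.apache.flink.table.api.StatementSet;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.functions.FunctionContext;
import org.apache.flink.table.functions.ScalarFunction;

// Option 2: a pass-through scalar UDF that counts every row it sees
// via a custom metric, visible in the Flink UI and metric reporters.
public class CountingIdentity extends ScalarFunction {

    private transient Counter rowCounter;

    @Override
    public void open(FunctionContext context) throws Exception {
        // "rowsSeen" is a hypothetical metric name chosen for this sketch.
        rowCounter = context.getMetricGroup().counter("rowsSeen");
    }

    // Called once per row; returns the value unchanged.
    public String eval(String value) {
        rowCounter.inc();
        return value;
    }
}
```

For option 3, the extra COUNT query is bundled with the main pipeline so both are optimized into one job graph (table names here are placeholders):

```java
// Assumes tableEnv is an existing TableEnvironment and the sinks are registered.
StatementSet set = tableEnv.createStatementSet();
set.addInsertSql("INSERT INTO main_sink SELECT * FROM source_table");
set.addInsertSql("INSERT INTO count_sink SELECT COUNT(*) FROM source_table");
set.execute();
```

With the StatementSet approach the source is read only once, since the planner shares the common scan between the two statements.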

Best,

Dawid

On 16/07/2020 09:36, Flavio Pompermaier wrote:
> Hi to all,
> in my legacy code (using Dataset api) I used to add a map function
> just after the Source read and keep the count of the rows. In this way
> I had a very light and unobtrusive way of counting the rows of a
> dataset. Can I do something similar in table API? Is there a way to
> use accumulators?
>
> Thanks in advance,
> Flavio