Posted to user@flink.apache.org by Teena K <av...@gmail.com> on 2018/01/02 05:14:53 UTC

Flink Kafka Consumer stops fetching records

Hi,

I am using Flink 1.4 along with Kafka 0.11. My stream job has 4 Kafka
consumers, each subscribing to a different topic. The stream from each
consumer gets processed in 3 to 4 different ways, thereby writing to a
total of 12 sinks (Cassandra tables). When the job runs, the first 8 or 10
records get processed correctly, but after that the consumers stop
fetching records. I have tried this with Flink 1.3.2 and Kafka 0.10, and
with Flink 1.4 and Kafka 0.10, all of which gave the same results.
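[Editor's note: a minimal sketch of the kind of topology described above, assuming Flink 1.4's FlinkKafkaConsumer011 and the flink-connector-cassandra sink. The topic name, broker address, Cassandra host, table, and query are all placeholders, not taken from the report.]

```java
import java.util.Properties;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.api.java.tuple.Tuple1;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.cassandra.CassandraSink;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

public class KafkaToCassandraSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.setProperty("group.id", "job-group");               // placeholder group id

        // One of the four consumers; the job described above has four,
        // each reading from its own topic.
        DataStream<String> streamA = env.addSource(
                new FlinkKafkaConsumer011<>("topic-a", new SimpleStringSchema(), props));

        // One of the 3-4 processing branches per stream; each branch ends
        // in its own Cassandra sink, for 12 sinks overall.
        DataStream<Tuple1<String>> branch1 = streamA
                .map(new MapFunction<String, Tuple1<String>>() {
                    @Override
                    public Tuple1<String> map(String value) {
                        return Tuple1.of(value.toUpperCase());
                    }
                });

        CassandraSink.addSink(branch1)
                .setQuery("INSERT INTO ks.table1 (value) VALUES (?);") // placeholder table
                .setHost("127.0.0.1")                                  // placeholder host
                .build();

        env.execute("kafka-to-cassandra-sketch");
    }
}
```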

Re: Flink Kafka Consumer stops fetching records

Posted by Timo Walther <tw...@apache.org>.
Hi Teena,

could you tell us a bit more about your job? Are you using event-time 
semantics?

Regards,
Timo

On 1/2/18 6:14 AM, Teena K wrote:
> Hi,
>
> I am using Flink 1.4 along with Kafka 0.11. My stream job has 4 Kafka 
> consumers, each subscribing to a different topic. The stream from each 
> consumer gets processed in 3 to 4 different ways, thereby writing to a 
> total of 12 sinks (Cassandra tables). When the job runs, the first 8 or 
> 10 records get processed correctly, but after that the consumers stop 
> fetching records. I have tried this with Flink 1.3.2 and Kafka 0.10, 
> and with Flink 1.4 and Kafka 0.10, all of which gave the same results.
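
[Editor's note: for context on why event-time semantics matters here: under event time, downstream operators only make progress when watermarks advance, so a source or Kafka partition that stops receiving records (or never gets a watermark assigner) can stall the whole pipeline even though other partitions have data. A hedged sketch of enabling event time and assigning watermarks on the consumer; the topic name, broker address, and the 5-second out-of-orderness bound are illustrative assumptions.]

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.timestamps.BoundedOutOfOrdernessTimestampExtractor;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

public class EventTimeSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();
        env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder broker

        FlinkKafkaConsumer011<String> consumer =
                new FlinkKafkaConsumer011<>("topic-a", new SimpleStringSchema(), props);

        // Assigning timestamps/watermarks directly on the consumer lets
        // Flink track watermarks per Kafka partition. The 5-second bound
        // is illustrative, not a recommendation.
        consumer.assignTimestampsAndWatermarks(
                new BoundedOutOfOrdernessTimestampExtractor<String>(Time.seconds(5)) {
                    @Override
                    public long extractTimestamp(String element) {
                        // Placeholder: a real job would parse the event's
                        // own timestamp out of the record.
                        return System.currentTimeMillis();
                    }
                });

        env.addSource(consumer).print();
        env.execute("event-time-sketch");
    }
}
```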