Posted to users@kafka.apache.org by Upendra Yadav <up...@gmail.com> on 2019/11/13 05:11:24 UTC

Best config for Kafka 0.10.0.1 consumer.assign

Hi,

I'm using the consumer assign() method with a 15000 ms poll timeout to
consume single-partition data from another DC.

Below are my consumer configs:
enable.auto.commit=false
max.poll.records=4000
max.partition.fetch.bytes=4096000
key.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
value.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer

With this, my consumer works fine, but when I change
max.partition.fetch.bytes to 16384000, my consumer stops receiving any
messages. There is no exception. Since I'm using consumer assign, do I
need to tune the properties below?
fetch.max.bytes
session.timeout.ms
heartbeat.interval.ms
Please let me know if I'm missing something.
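For reference, here is a minimal sketch of how the configuration above might be wired up in Java. The broker address, topic name, and partition number are placeholders I've invented for illustration; the property values are the ones from the post:

```java
import java.util.Properties;

public class ConsumerConfigSketch {

    // Builds the consumer properties listed in the post. Broker address
    // is a hypothetical placeholder, not from the original message.
    static Properties buildConfig() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "remote-dc-broker:9092"); // placeholder
        props.put("enable.auto.commit", "false");
        props.put("max.poll.records", "4000");
        props.put("max.partition.fetch.bytes", "4096000");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        return props;
    }

    public static void main(String[] args) {
        Properties props = buildConfig();
        // With the kafka-clients library on the classpath, usage would
        // follow the pattern described in the post (hypothetical names):
        //
        //   KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props);
        //   consumer.assign(Collections.singletonList(
        //           new TopicPartition("my-topic", 0)));
        //   ConsumerRecords<byte[], byte[]> records = consumer.poll(15000);
        System.out.println(props.getProperty("max.partition.fetch.bytes"));
    }
}
```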

Re: Best config for Kafka 0.10.0.1 consumer.assign

Posted by Upendra Yadav <up...@gmail.com>.
I have added this to my consumer config, and now it works fine.
receive.buffer.bytes=1048576
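
This fix is plausible for a cross-DC link: receive.buffer.bytes sets the TCP receive buffer (SO_RCVBUF) for the consumer's sockets, and on a high-latency path the achievable throughput is roughly bounded by buffer size divided by round-trip time, so a small default buffer can stall large fetches. A sketch of the resulting config (assuming the larger fetch size was kept, which the post implies but does not state explicitly):

```
# Working cross-DC consumer config implied by this thread
enable.auto.commit=false
max.poll.records=4000
max.partition.fetch.bytes=16384000
receive.buffer.bytes=1048576
key.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
value.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
```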


