Posted to users@kafka.apache.org by S Ahmed <sa...@gmail.com> on 2011/12/21 21:15:48 UTC

any message size restrictions? suggest range?

Was kafka designed for a specific message size range?

Seeing as it is used to aggregate log messages, is it safe to say message
sizes of 2-100K are reasonable and won't cause any issues?

Re: any message size restrictions? suggest range?

Posted by Neha Narkhede <ne...@gmail.com>.
Well, that depends on how much memory is available to your Kafka
consumer on the machine where it is running.
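
As a rough illustration, here is a minimal sketch of sizing the consumer
side for large messages. It assumes the 0.7-era high-level Java consumer
API, and the property names used here ("zk.connect", "groupid",
"fetch.size") are assumptions about that version, so check the consumer
config class in your build:

import java.util.Properties;

import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.javaapi.consumer.ConsumerConnector;

public class SizedConsumerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("zk.connect", "localhost:2181");   // assumed ZooKeeper address
        props.put("groupid", "log-aggregator");      // assumed consumer group id
        // The fetch buffer holds whole messages, so it must be at least as
        // large as the biggest message you expect, and the JVM heap must be
        // able to hold one such buffer per fetched partition.
        props.put("fetch.size", String.valueOf(1024 * 1024)); // 1 MB, in bytes

        ConsumerConnector connector =
                Consumer.createJavaConsumerConnector(new ConsumerConfig(props));
        // ... create message streams and iterate over them as usual ...
        connector.shutdown();
    }
}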

Thanks,
Neha

On Wed, Dec 21, 2011 at 1:23 PM, S Ahmed <sa...@gmail.com> wrote:
> What would be an upper bound then? I.e., 100K should be OK, but what
> shouldn't be? :)
>
> On Wed, Dec 21, 2011 at 4:16 PM, Neha Narkhede <ne...@gmail.com> wrote:
>
>> >> Was kafka designed for a specific message size range?
>>
>> The Kafka consumer reads each message from the socket into memory. If a
>> message is large enough to cause an OutOfMemoryError, the Kafka
>> consumer is unable to return any more messages from the socket byte
>> buffer. This could be addressed by giving the Kafka consumers a
>> 'streaming' API, where such large messages could be read in a
>> piecemeal fashion, but that is tricky and we don't have that feature
>> yet.
>>
>> To keep your Kafka consumer from getting into a bad state due to a
>> large message, you can set "max.message.size" on your producer to the
>> largest message size you expect. Any message larger than that never
>> enters the Kafka cluster and hence never reaches a Kafka consumer.
>>
>> >> Seeing as it is used to aggregate log messages, is it safe to say message
>> >> sizes of 2-100K are reasonable and won't cause any issues?
>>
>> 100K message sizes should work fine.
>>
>> Thanks,
>> Neha
>>
>> On Wed, Dec 21, 2011 at 12:15 PM, S Ahmed <sa...@gmail.com> wrote:
>> > Was kafka designed for a specific message size range?
>> >
>> > Seeing as it is used to aggregate log messages, is it safe to say message
>> > sizes of 2-100K are reasonable and won't cause any issues?
>>

Re: any message size restrictions? suggest range?

Posted by S Ahmed <sa...@gmail.com>.
What would be an upper bound then? I.e., 100K should be OK, but what
shouldn't be? :)

On Wed, Dec 21, 2011 at 4:16 PM, Neha Narkhede <ne...@gmail.com> wrote:

> >> Was kafka designed for a specific message size range?
>
> The Kafka consumer reads each message from the socket into memory. If a
> message is large enough to cause an OutOfMemoryError, the Kafka
> consumer is unable to return any more messages from the socket byte
> buffer. This could be addressed by giving the Kafka consumers a
> 'streaming' API, where such large messages could be read in a
> piecemeal fashion, but that is tricky and we don't have that feature
> yet.
>
> To keep your Kafka consumer from getting into a bad state due to a
> large message, you can set "max.message.size" on your producer to the
> largest message size you expect. Any message larger than that never
> enters the Kafka cluster and hence never reaches a Kafka consumer.
>
> >> Seeing as it is used to aggregate log messages, is it safe to say message
> >> sizes of 2-100K are reasonable and won't cause any issues?
>
> 100K message sizes should work fine.
>
> Thanks,
> Neha
>
> On Wed, Dec 21, 2011 at 12:15 PM, S Ahmed <sa...@gmail.com> wrote:
> > Was kafka designed for a specific message size range?
> >
> > Seeing as it is used to aggregate log messages, is it safe to say message
> > sizes of 2-100K are reasonable and won't cause any issues?
>

Re: any message size restrictions? suggest range?

Posted by Neha Narkhede <ne...@gmail.com>.
>> Was kafka designed for a specific message size range?

The Kafka consumer reads each message from the socket into memory. If a
message is large enough to cause an OutOfMemoryError, the Kafka
consumer is unable to return any more messages from the socket byte
buffer. This could be addressed by giving the Kafka consumers a
'streaming' API, where such large messages could be read in a
piecemeal fashion, but that is tricky and we don't have that feature
yet.

To keep your Kafka consumer from getting into a bad state due to a
large message, you can set "max.message.size" on your producer to the
largest message size you expect. Any message larger than that never
enters the Kafka cluster and hence never reaches a Kafka consumer.
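
As a rough sketch of that producer-side guard, assuming the 0.7-era Java
producer API (the "max.message.size" key is the one described above; the
"zk.connect" and "serializer.class" settings and the class names are
assumptions about that version):

import java.util.Properties;

import kafka.javaapi.producer.Producer;
import kafka.javaapi.producer.ProducerData;
import kafka.producer.ProducerConfig;

public class BoundedProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("zk.connect", "localhost:2181");                        // assumed ZooKeeper address
        props.put("serializer.class", "kafka.serializer.StringEncoder");  // assumed encoder
        // Cap the largest message this producer will send (value in bytes).
        // Anything bigger is rejected before it ever reaches the cluster.
        props.put("max.message.size", String.valueOf(100 * 1024));

        Producer<String, String> producer =
                new Producer<String, String>(new ProducerConfig(props));
        producer.send(new ProducerData<String, String>("log-events", "an example log line"));
        producer.close();
    }
}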

>> Seeing as it is used to aggregate log messages, is it safe to say message
>> sizes of 2-100K are reasonable and won't cause any issues?

100K message sizes should work fine.

Thanks,
Neha

On Wed, Dec 21, 2011 at 12:15 PM, S Ahmed <sa...@gmail.com> wrote:
> Was kafka designed for a specific message size range?
>
> Seeing as it is used to aggregate log messages, is it safe to say message
> sizes of 2-100K are reasonable and won't cause any issues?