Posted to users@kafka.apache.org by Bo Sun <do...@gmail.com> on 2013/01/29 09:18:19 UTC
Payload size exception
Hi all,
I've got an exception:
kafka.common.MessageSizeTooLargeException: payload size of 1772597 larger than 1000000
    at kafka.message.ByteBufferMessageSet.verifyMessageSize(ByteBufferMessageSet.scala:93)
    at kafka.producer.SyncProducer.send(SyncProducer.scala:122)
    at kafka.producer.ProducerPool$$anonfun$send$1.apply$mcVI$sp(ProducerPool.scala:114)
    at kafka.producer.ProducerPool$$anonfun$send$1.apply(ProducerPool.scala:100)
    at kafka.producer.ProducerPool$$anonfun$send$1.apply(ProducerPool.scala:100)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:57)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:43)
    at kafka.producer.ProducerPool.send(ProducerPool.scala:100)
    at kafka.producer.Producer.zkSend(Producer.scala:140)
    at kafka.producer.Producer.send(Producer.scala:99)
    at kafka.javaapi.producer.Producer.send(Producer.scala:103)
I don't know why. Please help me. Thanks.
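For reference, the arithmetic behind the error is simple: the producer rejects any message whose payload exceeds the configured maximum (1000000 bytes, per the error text). A minimal standalone sketch of that check follows; the class name is illustrative, and the real check lives in kafka.message.ByteBufferMessageSet.verifyMessageSize:

```java
// Illustrative sketch of the producer-side size check; not Kafka's actual code.
public class PayloadSizeCheck {
    // Limit implied by the error message ("larger than 1000000").
    static final int MAX_MESSAGE_SIZE = 1000000;

    static void verifyMessageSize(int payloadSize, int maxSize) {
        if (payloadSize > maxSize) {
            throw new IllegalArgumentException(
                "payload size of " + payloadSize + " larger than " + maxSize);
        }
    }

    public static void main(String[] args) {
        verifyMessageSize(999999, MAX_MESSAGE_SIZE); // under the limit: no exception
        try {
            verifyMessageSize(1772597, MAX_MESSAGE_SIZE); // the size from the post above
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```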
Re: Payload size exception
Posted by Jay Kreps <ja...@gmail.com>.
Ack, right you are Neha, my bad.
WRT how to set the maximum, there are two considerations:
1. It should be smaller than the fetch size your consumers use.
2. Messages are fully instantiated in memory, so obscenely large messages
(say hundreds of MB) will cause a lot of memory allocation churn/problems.
-Jay
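[To illustrate consideration 1: the consumer's fetch size must be at least as large as the biggest message you allow, or oversized messages can never be fetched. A sketch with purely illustrative values, using 0.7-era property names as discussed in this thread:

```properties
# consumer configuration (0.7.x) -- illustrative values
fetch.size=3145728          # must be >= the largest allowed message
# and wherever the message limit is set (producer in 0.7, broker in 0.8):
# max.message.size=2097152
```
]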
On Tue, Jan 29, 2013 at 8:57 AM, S Ahmed <sa...@gmail.com> wrote:
> Neha/Jay,
>
> At linkedin, what is the largest payload size per message you guys have in
> production? My app's messages might be around 20-100 kilobytes, and I am
> hoping to get an idea of whether others have messages this large in any
> production use case.
>
>
> On Tue, Jan 29, 2013 at 11:35 AM, Neha Narkhede <neha.narkhede@gmail.com> wrote:
>
> > > In 0.7.x this
> > > setting is controlled by the broker configuration max.message.size.
> > >
> >
> > Actually, in 0.7.x this setting is controlled by max.message.size on the
> > producer. In 0.8, we moved this setting to the broker.
> >
> > Thanks,
> > Neha
> >
>
Re: Payload size exception
Posted by S Ahmed <sa...@gmail.com>.
OK, so it might be an issue somewhere in the pipeline (I'm guessing memory
issues?). They are XML files, and that 30-100 KB was uncompressed.
On Tue, Jan 29, 2013 at 12:28 PM, Neha Narkhede <ne...@gmail.com> wrote:
> > At linkedin, what is the largest payload size per message you guys have
> in
> > production?
> >
>
> Roughly 30K after compression, but that is fairly rare. Most messages are <
> 500 bytes after compression.
>
> Thanks,
> Neha
>
Re: Payload size exception
Posted by Neha Narkhede <ne...@gmail.com>.
> At linkedin, what is the largest payload size per message you guys have in
> production?
>
Roughly 30K after compression, but that is fairly rare. Most messages are <
500 bytes after compression.
Thanks,
Neha
Re: Payload size exception
Posted by Xavier Stevens <xs...@mozilla.com>.
Not quite in production yet, but we have payloads in the 30KB+ range. I just
added max.message.size to the broker's server.properties.
-Xavier
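[For anyone following along, Xavier's change amounts to something like the following in the broker's server.properties; the 2 MB value here is purely illustrative:

```properties
# broker server.properties -- illustrative value
max.message.size=2097152
```
]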
On 1/29/13 8:57 AM, S Ahmed wrote:
> Neha/Jay,
>
> At linkedin, what is the largest payload size per message you guys have in
> production? My app's messages might be around 20-100 kilobytes, and I am
> hoping to get an idea of whether others have messages this large in any
> production use case.
>
>
> On Tue, Jan 29, 2013 at 11:35 AM, Neha Narkhede <ne...@gmail.com> wrote:
>
>>> In 0.7.x this
>>> setting is controlled by the broker configuration max.message.size.
>>>
>> Actually, in 0.7.x this setting is controlled by max.message.size on the
>> producer. In 0.8, we moved this setting to the broker.
>>
>> Thanks,
>> Neha
>>
Re: Payload size exception
Posted by S Ahmed <sa...@gmail.com>.
Neha/Jay,
At linkedin, what is the largest payload size per message you guys have in
production? My app's messages might be around 20-100 kilobytes, and I am
hoping to get an idea of whether others have messages this large in any
production use case.
On Tue, Jan 29, 2013 at 11:35 AM, Neha Narkhede <ne...@gmail.com> wrote:
> > In 0.7.x this
> > setting is controlled by the broker configuration max.message.size.
> >
>
> Actually, in 0.7.x this setting is controlled by max.message.size on the
> producer. In 0.8, we moved this setting to the broker.
>
> Thanks,
> Neha
>
Re: Payload size exception
Posted by Neha Narkhede <ne...@gmail.com>.
> In 0.7.x this
> setting is controlled by the broker configuration max.message.size.
>
Actually, in 0.7.x this setting is controlled by max.message.size on the
producer. In 0.8, we moved this setting to the broker.
Thanks,
Neha
Re: Payload size exception
Posted by Jay Kreps <ja...@gmail.com>.
There is a setting that controls the maximum message size. This is to
ensure the messages can be read on the server and by all consumers without
running out of memory or exceeding the consumer fetch size. In 0.7.x this
setting is controlled by the broker configuration max.message.size.
-Jay
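[As a concrete sketch, and assuming the 0.7 producer-side property name per Neha's correction elsewhere in this thread, the limit could be raised when building the producer config. All property names and values here are illustrative 0.7-era settings, not a definitive recipe:

```java
import java.util.Properties;

// Sketch of raising the 0.7 producer's message size limit.
// Property names follow this thread's discussion; values are illustrative.
public class ProducerConfigSketch {
    static Properties producerProps() {
        Properties props = new Properties();
        props.put("zk.connect", "localhost:2181");   // 0.7-era ZooKeeper-based discovery
        props.put("serializer.class", "kafka.serializer.DefaultEncoder");
        props.put("max.message.size", "2097152");    // raise the limit to ~2 MB
        // These props would then be passed to kafka.producer.ProducerConfig.
        return props;
    }

    public static void main(String[] args) {
        System.out.println(producerProps().getProperty("max.message.size"));
    }
}
```

Per Jay's point above, any consumer's fetch size would also need to be at least this large.]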
On Tue, Jan 29, 2013 at 12:18 AM, Bo Sun <do...@gmail.com> wrote:
> Hi all,
> I've got an exception:
> kafka.common.MessageSizeTooLargeException: payload size of 1772597 larger than 1000000
>     at kafka.message.ByteBufferMessageSet.verifyMessageSize(ByteBufferMessageSet.scala:93)
>     at kafka.producer.SyncProducer.send(SyncProducer.scala:122)
>     at kafka.producer.ProducerPool$$anonfun$send$1.apply$mcVI$sp(ProducerPool.scala:114)
>     at kafka.producer.ProducerPool$$anonfun$send$1.apply(ProducerPool.scala:100)
>     at kafka.producer.ProducerPool$$anonfun$send$1.apply(ProducerPool.scala:100)
>     at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:57)
>     at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:43)
>     at kafka.producer.ProducerPool.send(ProducerPool.scala:100)
>     at kafka.producer.Producer.zkSend(Producer.scala:140)
>     at kafka.producer.Producer.send(Producer.scala:99)
>     at kafka.javaapi.producer.Producer.send(Producer.scala:103)
>
> I don't know why. Please help me. Thanks.