Posted to users@kafka.apache.org by Anishek Agarwal <an...@gmail.com> on 2015/11/24 12:25:58 UTC

Compressed message size limits

Hello,

I am trying to send compressed messages to Kafka. The topic/broker
configurations are the defaults, and I have set "compression.type" to "snappy"
on the Kafka producer. The uncompressed message size is 1160668 bytes, and the
error I get is

*org.apache.kafka.common.errors.RecordTooLargeException: The message is
1160694 bytes when serialized which is larger than the maximum request size
you have configured with the max.request.size configuration.*

The difference, I am assuming, is the message header and other overhead.

I am using org.apache.kafka:kafka-clients:0.8.2.1.
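
For reference, my producer setup is roughly along these lines (the class name,
broker address, topic and payload here are placeholders, not the actual values
from my application):

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class CompressedSendTest {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");   // placeholder broker
            props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer",
                "org.apache.kafka.common.serialization.ByteArraySerializer");
            props.put("compression.type", "snappy");
            // Raising max.request.size (default 1048576 bytes) should avoid the
            // exception, but I would have expected the limit to apply to the
            // compressed size instead:
            // props.put("max.request.size", "2097152");

            KafkaProducer<String, byte[]> producer =
                new KafkaProducer<String, byte[]>(props);
            byte[] payload = new byte[1160668];  // uncompressed size from the error
            producer.send(new ProducerRecord<String, byte[]>("my-topic", payload));
            producer.close();
        }
    }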

I would have thought the message size validation would be done after the
message is compressed?

I looked at the source code, and the check is done on the raw serialized
value, not the compressed value. Is this the correct behavior? It is on tag
0.8.2.1, in KafkaProducer.java (line 335).
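
For anyone following along, the relevant part looks roughly like this
(paraphrased from the 0.8.2.1 sources, not an exact copy):

    // in KafkaProducer.send(): the size is computed from the serialized
    // (uncompressed) key and value, before the record reaches the compressor
    int serializedSize = Records.LOG_OVERHEAD +
        Record.recordSize(serializedKey, serializedValue);
    ensureValidRecordSize(serializedSize);

    // ensureValidRecordSize() then compares that uncompressed size against
    // max.request.size and throws if it is larger
    private void ensureValidRecordSize(int size) {
        if (size > this.maxRequestSize)
            throw new RecordTooLargeException("The message is " + size +
                " bytes when serialized which is larger than the maximum request " +
                "size you have configured with the max.request.size configuration.");
    }

As far as I can tell, the compression itself only happens later, when the
record is appended to the accumulator batch, so by that point the check has
already run against the uncompressed size.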

This seems to be related to https://issues.apache.org/jira/browse/KAFKA-1718.

Please do let me know if I am interpreting this wrong.

I also looked at the kafka-python code a bit, and it seems that it does not
do this validation before sending messages; it just creates the message set
using the specified codec.

Thanks,
Anishek