Posted to jira@kafka.apache.org by "Mark Cox (Jira)" <ji...@apache.org> on 2020/05/23 00:24:00 UTC
[jira] [Created] (KAFKA-10034) Clarify Usage of "batch.size" and "max.request.size" Producer Configs
Mark Cox created KAFKA-10034:
--------------------------------
Summary: Clarify Usage of "batch.size" and "max.request.size" Producer Configs
Key: KAFKA-10034
URL: https://issues.apache.org/jira/browse/KAFKA-10034
Project: Kafka
Issue Type: Improvement
Components: docs, producer
Reporter: Mark Cox
The documentation for the producer configurations "batch.size" and "max.request.size", and for how they relate to one another, can be confusing.
In reality, "max.request.size" is a hard limit on each individual record, but the documentation makes it sound like the maximum size of a request sent to Kafka. If "batch.size" is set greater than "max.request.size" (and each individual record is smaller than "max.request.size"), the producer can end up sending larger requests to Kafka than expected.
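A minimal sketch of the surprise described above, with made-up sizes (these numbers are illustrative, not defaults from any Kafka release): every record individually passes a 1 MB "max.request.size" check, yet the batch they accumulate into is larger than 1 MB.

```python
# Hypothetical sizes chosen for illustration only.
MAX_REQUEST_SIZE = 1_048_576   # 1 MB -- in practice a per-record limit
BATCH_SIZE = 2_097_152         # 2 MB -- per-partition batch target

record_size = 700_000          # each record is well under max.request.size

# Accumulate records into a batch, as the producer does, until the
# batch target would be exceeded.
batch_bytes = 0
while batch_bytes + record_size <= BATCH_SIZE:
    assert record_size <= MAX_REQUEST_SIZE  # per-record check passes
    batch_bytes += record_size

# Two 700 KB records fit, so the batch is 1.4 MB -- larger than the
# 1 MB a reader of the "max.request.size" docs might expect.
print(batch_bytes, batch_bytes > MAX_REQUEST_SIZE)
```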
There are a few things that could be considered to make this clearer:
# Improve the documentation to clarify the two producer configurations and how they relate to each other
# Provide a producer check, and possibly a warning, if "batch.size" is found to be greater than "max.request.size"
# The producer could take the _minimum_ of "batch.size" and "max.request.size"
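Options 2 and 3 above could be sketched as a single check at producer construction time. This is a hypothetical helper, not actual Kafka producer code; the function name and warning text are assumptions for illustration.

```python
import warnings

def effective_batch_size(batch_size: int, max_request_size: int) -> int:
    """Warn if batch.size exceeds max.request.size, then clamp it
    (hypothetical sketch of suggestions 2 and 3)."""
    if batch_size > max_request_size:
        warnings.warn(
            f"batch.size ({batch_size}) exceeds max.request.size "
            f"({max_request_size}); using the smaller value"
        )
        return max_request_size
    return batch_size

# A 2 MB batch.size against a 1 MB max.request.size is clamped to 1 MB.
print(effective_batch_size(2_097_152, 1_048_576))
```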
--
This message was sent by Atlassian Jira
(v8.3.4#803005)