Posted to issues@spark.apache.org by "handong (Jira)" <ji...@apache.org> on 2021/08/19 11:44:00 UTC
[jira] [Updated] (SPARK-36545) Spark Streaming input rate exceeds
spark.streaming.kafka.maxRatePerPartition
[ https://issues.apache.org/jira/browse/SPARK-36545?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
handong updated SPARK-36545:
----------------------------
Component/s: YARN
Spark Submit
Description:
I have a Spark Streaming application consuming from Kafka.
Here are the parameters:
kafka partitions = 500
batch time = 60 s
--conf spark.streaming.backpressure.enabled=true
--conf spark.streaming.kafka.maxRatePerPartition=2500
expected input size per batch = 500 * 60 * 2500 = 75,000,000 records
However, the input size grows to 160,000,000 records after some batches.
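As a sanity check on the numbers above: spark.streaming.kafka.maxRatePerPartition is a rate in records per second per partition, so the per-batch ceiling implied by the reported settings is partitions * batch seconds * rate. A minimal sketch of that arithmetic (the variable names are illustrative, not Spark configuration keys):

```python
# Per-batch ceiling implied by the reported settings (illustrative sketch).
partitions = 500      # Kafka partitions, from the report
batch_seconds = 60    # batch interval in seconds, from the report
max_rate = 2500       # spark.streaming.kafka.maxRatePerPartition (records/sec/partition)

# maxRatePerPartition caps records per second per partition,
# so the maximum records in one batch is:
cap = partitions * batch_seconds * max_rate
print(cap)  # 75000000
```

The reported 160,000,000 records per batch is roughly double this ceiling, which is what makes the behavior look like a bug rather than a configuration mistake.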
> Spark Streaming input rate exceeds spark.streaming.kafka.maxRatePerPartition
> ----------------------------------------------------------------------------
>
> Key: SPARK-36545
> URL: https://issues.apache.org/jira/browse/SPARK-36545
> Project: Spark
> Issue Type: Bug
> Components: DStreams, Spark Submit, YARN
> Affects Versions: 2.4.5
> Reporter: handong
> Priority: Major
>
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org