Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:15:16 UTC

[jira] [Resolved] (SPARK-21378) Spark Poll timeout when specific offsets are passed

     [ https://issues.apache.org/jira/browse/SPARK-21378?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-21378.
----------------------------------
    Resolution: Incomplete

> Spark Poll timeout when specific offsets are passed
> ---------------------------------------------------
>
>                 Key: SPARK-21378
>                 URL: https://issues.apache.org/jira/browse/SPARK-21378
>             Project: Spark
>          Issue Type: Bug
>          Components: DStreams
>    Affects Versions: 2.0.0, 2.0.2
>            Reporter: Ambud Sharma
>            Priority: Major
>              Labels: bulk-closed
>
> Kafka direct stream fails with poll timeout:
> {code:java}
> JavaInputDStream<ConsumerRecord<String, String>> stream = KafkaUtils.createDirectStream(ssc,
> 				LocationStrategies.PreferConsistent(),
> 				ConsumerStrategies.<String, String>Subscribe(topicsCollection, kafkaParams, fromOffsets));
> {code}
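> For context, here is a minimal sketch of how the arguments above are typically populated when starting from specific offsets (the broker address, topic name, group id and offset values are illustrative assumptions, not taken from the failing job):
> {code:java}
> import java.util.*;
> import org.apache.kafka.common.TopicPartition;
> import org.apache.kafka.common.serialization.StringDeserializer;
>
> // Illustrative values only; substitute the real brokers, topic and offsets.
> Map<String, Object> kafkaParams = new HashMap<>();
> kafkaParams.put("bootstrap.servers", "broker1:9092");
> kafkaParams.put("key.deserializer", StringDeserializer.class);
> kafkaParams.put("value.deserializer", StringDeserializer.class);
> kafkaParams.put("group.id", "example-group");
> kafkaParams.put("enable.auto.commit", false);
>
> Collection<String> topicsCollection = Arrays.asList("example-topic");
>
> // The "specific offsets" from the summary: start each partition at a stored offset.
> Map<TopicPartition, Long> fromOffsets = new HashMap<>();
> fromOffsets.put(new TopicPartition("example-topic", 0), 42L);
> fromOffsets.put(new TopicPartition("example-topic", 1), 42L);
> {code}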
> Digging deeper shows that there is an assert statement which fails whenever a poll returns no records, even though an empty poll is a valid outcome:
> https://github.com/apache/spark/blob/39e2bad6a866d27c3ca594d15e574a1da3ee84cc/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/CachedKafkaConsumer.scala#L75
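> In Java the failing pattern looks roughly like this (an illustrative paraphrase of the Scala code linked above, not the actual source; groupId and pollTimeoutMs are stand-ins):
> {code:java}
> // Paraphrase of CachedKafkaConsumer.get(offset, timeout): poll, then insist data arrived.
> ConsumerRecords<String, String> records = consumer.poll(pollTimeoutMs);
> if (records.isEmpty()) {
>     // An empty poll is a legitimate outcome (e.g. no new data at the requested
>     // offsets yet), but the check treats it as fatal and fails the task.
>     throw new AssertionError(
>         "Failed to get records for " + groupId + " after polling for " + pollTimeoutMs);
> }
> {code}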
> Applying the solution from https://issues.apache.org/jira/browse/SPARK-19275 does not help: the job keeps logging "Added jobs for time" and eventually fails with "Failed to get records for spark-xxxxx after polling for 3000"; in this case the batch interval is 3 seconds.
> We can increase the poll timeout to an even bigger number, but that only leads to OOM.
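> For reference, the poll timeout involved here is controlled by spark.streaming.kafka.consumer.poll.ms in the kafka-0-10 integration; a sketch of raising it (the value is an illustrative assumption):
> {code:java}
> import org.apache.spark.SparkConf;
> import org.apache.spark.streaming.Durations;
> import org.apache.spark.streaming.api.java.JavaStreamingContext;
>
> SparkConf conf = new SparkConf()
>         .setAppName("specific-offsets-job")
>         // Raising the poll timeout only postpones the assert above; as noted,
>         // pushing it very high ends in OOM rather than fixing the problem.
>         .set("spark.streaming.kafka.consumer.poll.ms", "10000");
> JavaStreamingContext ssc = new JavaStreamingContext(conf, Durations.seconds(3));
> {code}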



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
