Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/08/07 07:33:07 UTC

[GitHub] [spark] HeartSaVioR commented on a change in pull request #25135: [SPARK-28367][SS] Use new KafkaConsumer.poll API in Kafka connector

URL: https://github.com/apache/spark/pull/25135#discussion_r311406498
 
 

 ##########
 File path: external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaOffsetReader.scala
 ##########
 @@ -419,6 +416,21 @@ private[kafka010] class KafkaOffsetReader(
     stopConsumer()
     _consumer = null  // will automatically get reinitialized again
   }
+
+  private def getPartitions(): ju.Set[TopicPartition] = {
+    consumer.poll(jt.Duration.ZERO)
+    var partitions = consumer.assignment()
+    val startTimeMs = System.currentTimeMillis()
+    while (partitions.isEmpty && System.currentTimeMillis() - startTimeMs < pollTimeoutMs) {
+      // Poll to get the latest assigned partitions
+      consumer.poll(jt.Duration.ofMillis(100))
 
 Review comment:
   Good point. While the Kafka docs say the behavior of such a hack is nondeterministic and Kafka has never officially supported it, we rely on that behavior anyway.
   
   I've started a thread to ask about viable alternatives to `poll(0)` and the possibility of adding a public API that updates metadata only.
   https://lists.apache.org/thread.html/017cf631ef981ab1b494b1249be5c11d7edfe5f4867770a18188ebdc@%3Cdev.kafka.apache.org%3E
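   To make the workaround under discussion concrete, here is a hedged, self-contained sketch of how the poll-until-assigned loop from the diff might look when completed. `MinimalConsumer` and `StubConsumer` are hypothetical stand-ins introduced only so the sketch runs without a Kafka broker; the real code operates on `KafkaConsumer` and `TopicPartition`.

   ```scala
   import java.time.Duration

   // Hypothetical minimal surface of the consumer used by the loop.
   trait MinimalConsumer {
     def poll(timeout: Duration): Unit
     def assignment(): java.util.Set[String]
   }

   // Stub that reports an empty assignment for the first few polls,
   // mimicking a consumer whose group rebalance has not finished yet.
   class StubConsumer(pollsUntilAssigned: Int) extends MinimalConsumer {
     private var polls = 0
     override def poll(timeout: Duration): Unit = { polls += 1 }
     override def assignment(): java.util.Set[String] = {
       val set = new java.util.HashSet[String]()
       if (polls >= pollsUntilAssigned) set.add("topic-0")
       set
     }
   }

   // Mirrors the loop in the diff: poll with a zero timeout to trigger a
   // metadata update, then keep polling in short intervals until partitions
   // are assigned or the overall timeout elapses.
   def getPartitions(consumer: MinimalConsumer, pollTimeoutMs: Long): java.util.Set[String] = {
     consumer.poll(Duration.ZERO)
     var partitions = consumer.assignment()
     val startTimeMs = System.currentTimeMillis()
     while (partitions.isEmpty && System.currentTimeMillis() - startTimeMs < pollTimeoutMs) {
       consumer.poll(Duration.ofMillis(100))
       partitions = consumer.assignment()
     }
     partitions
   }
   ```

   As the comment above notes, this pattern is a hack: `poll(Duration.ZERO)` is not guaranteed to complete a metadata update, which is exactly why the loop re-polls with a bounded timeout instead of trusting a single call.
   
   
   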

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org