Posted to issues@spark.apache.org by "Cody Koeninger (JIRA)" <ji...@apache.org> on 2017/11/23 17:31:00 UTC

[jira] [Commented] (SPARK-22561) Dynamically update topics list for spark kafka consumer

    [ https://issues.apache.org/jira/browse/SPARK-22561?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16264643#comment-16264643 ] 

Cody Koeninger commented on SPARK-22561:
----------------------------------------

See SubscribePattern

http://spark.apache.org/docs/latest/streaming-kafka-0-10-integration.html#consumerstrategies
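For reference, a minimal sketch of a pattern-based subscription with the 0-10 integration, per the linked docs. The broker address, group id, app name, batch interval, and the "topic-.*" regex are placeholder assumptions, not part of the ticket:

    import java.util.regex.Pattern

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010._

    val conf = new SparkConf().setAppName("subscribe-pattern-example")
    val ssc = new StreamingContext(conf, Seconds(5))

    // Standard Kafka consumer config; broker address and group id are placeholders.
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "example-group",
      "auto.offset.reset" -> "latest"
    )

    // Subscribe by regex instead of a fixed topic list: topics created later
    // that match the pattern (e.g. topic-3, topic-4) are picked up without
    // restarting the streaming context.
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.SubscribePattern[String, String](
        Pattern.compile("topic-.*"), kafkaParams)
    )

    stream.map(record => (record.key, record.value)).print()

    ssc.start()
    ssc.awaitTermination()

How quickly a newly created topic is discovered is governed by the underlying consumer's metadata refresh interval (the Kafka consumer property metadata.max.age.ms, 5 minutes by default).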

> Dynamically update topics list for spark kafka consumer
> -------------------------------------------------------
>
>                 Key: SPARK-22561
>                 URL: https://issues.apache.org/jira/browse/SPARK-22561
>             Project: Spark
>          Issue Type: New Feature
>          Components: DStreams
>    Affects Versions: 2.1.0, 2.1.1, 2.2.0
>            Reporter: Arun
>
> The Spark Streaming application should allow adding new topics after the streaming context is initialized and the DStream is started. This is a very useful feature, especially when a business operates across multiple geographies or business units.
> For example, initially I have a spark-kafka consumer listening for the topics ["topic-1","topic-2"], and after a couple of days I add new topics ["topic-3","topic-4"] to Kafka. Is there a way to update the spark-kafka consumer's topic list and have it consume data for the updated list of topics without stopping the Spark Streaming application or its streaming context?



