Posted to issues@spark.apache.org by "Reynold Xin (JIRA)" <ji...@apache.org> on 2016/03/02 06:33:18 UTC

[jira] [Comment Edited] (SPARK-12177) Update KafkaDStreams to new Kafka 0.9 Consumer API

    [ https://issues.apache.org/jira/browse/SPARK-12177?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15175066#comment-15175066 ] 

Reynold Xin edited comment on SPARK-12177 at 3/2/16 5:32 AM:
-------------------------------------------------------------

This thread is getting too long for me to follow, but my instinct is that maybe we should have two subprojects and support both. Otherwise the upgrade to Spark 2.0 would be very painful for Kafka 0.8 users.

It's much more difficult to upgrade Kafka, which is a message bus, than it is to upgrade Spark.
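For illustration, a minimal sketch of what two subprojects could look like from a downstream build (the artifact names and version below are hypothetical placeholders; nothing in this thread fixes them):

    // Hypothetical sbt dependencies if Spark shipped separate Kafka 0.8 and 0.9 integrations.
    // Artifact names and version are placeholders for illustration only.
    libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-8" % "2.0.0"
    // ...or, for applications ready to move to the new consumer API:
    // libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-9" % "2.0.0"

An application would pick exactly one of the two, so neither group of users would be forced to upgrade their Kafka brokers in lockstep with Spark.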


was (Author: rxin):
This thread is getting too long for me to follow, but my instinct is that maybe we should have two subprojects and support both.



> Update KafkaDStreams to new Kafka 0.9 Consumer API
> --------------------------------------------------
>
>                 Key: SPARK-12177
>                 URL: https://issues.apache.org/jira/browse/SPARK-12177
>             Project: Spark
>          Issue Type: Improvement
>          Components: Streaming
>    Affects Versions: 1.6.0
>            Reporter: Nikita Tarasenko
>              Labels: consumer, kafka
>
> Kafka 0.9 has already been released, and it introduces a new consumer API that is not compatible with the old one. So I added the new consumer API as separate classes in the package org.apache.spark.streaming.kafka.v09. I did not remove the old classes, to preserve backward compatibility: users will not need to change their existing Spark applications when they upgrade to the new Spark version.
> Please review my changes.
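For context, a minimal sketch of the Kafka 0.9 "new" consumer API that the description refers to (broker address, group id, and topic name are placeholders, not values from this issue):

    // Standalone sketch of the new consumer API introduced in Kafka 0.9.
    import java.util.{Collections, Properties}
    import org.apache.kafka.clients.consumer.KafkaConsumer
    import scala.collection.JavaConverters._

    object NewConsumerSketch {
      def main(args: Array[String]): Unit = {
        val props = new Properties()
        props.put("bootstrap.servers", "localhost:9092")   // placeholder broker
        props.put("group.id", "example-group")              // placeholder group id
        props.put("key.deserializer",
          "org.apache.kafka.common.serialization.StringDeserializer")
        props.put("value.deserializer",
          "org.apache.kafka.common.serialization.StringDeserializer")

        // The new consumer talks to the brokers directly; it does not go through
        // ZooKeeper, which is the main incompatibility with the old high-level consumer.
        val consumer = new KafkaConsumer[String, String](props)
        consumer.subscribe(Collections.singletonList("example-topic"))
        try {
          val records = consumer.poll(1000)                  // poll timeout in ms
          records.asScala.foreach(r => println(s"${r.partition} ${r.offset} ${r.value}"))
        } finally {
          consumer.close()
        }
      }
    }

Compare this subscribe/poll model with the old high-level consumer, which connected through ZooKeeper and exposed createMessageStreams; that difference is why the proposal keeps the old classes and adds the new ones in a separate package.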


