Posted to dev@spark.apache.org by eugene miretsky <eu...@gmail.com> on 2015/12/23 22:27:03 UTC

Kafka consumer: Upgrading to use the new Java Consumer

Hi,

The Kafka connector currently uses the older Kafka Scala consumer. Kafka
0.9 came out with a new Java Kafka consumer.

One of the main differences is that the Scala consumer uses a Decoder trait
(kafka.serializer.Decoder) to decode keys/values, while the Java consumer uses
the Deserializer interface
(org.apache.kafka.common.serialization.Deserializer).
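
For reference, this is roughly how the two APIs compare (signatures
paraphrased from the Kafka 0.8.x and 0.9 sources, both shown in Scala form):

    // Old Scala consumer (Kafka 0.8.x): the decoder only ever sees raw bytes.
    // kafka.serializer.Decoder
    trait Decoder[T] {
      def fromBytes(bytes: Array[Byte]): T
    }

    // New Java consumer (Kafka 0.9): the deserializer is also handed the topic.
    // org.apache.kafka.common.serialization.Deserializer is a Java interface;
    // its Scala-equivalent signatures look like this:
    trait Deserializer[T] {
      def configure(configs: java.util.Map[String, _], isKey: Boolean): Unit
      def deserialize(topic: String, data: Array[Byte]): T
      def close(): Unit
    }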

The main difference between the two is that Deserializer.deserialize accepts
both a topic and a payload, while Decoder.fromBytes accepts only a payload.
Having the topic name available is quite useful; as one example, the Confluent
Schema Registry uses topic names to find the schema for each key/value.
Confluent does provide a Decoder implementation, but it is mostly a hack that
is incompatible with the new Kafka Java Producer.
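
To make the difference concrete, here is a minimal, hypothetical Deserializer
sketch in Scala that picks a parser based on the topic name. The hard-coded
map is just a stand-in for a real per-topic schema lookup (such as a Schema
Registry call); it is not Confluent's actual client:

    import java.util
    import org.apache.kafka.common.serialization.Deserializer

    // Hypothetical example: choose how to parse a record based on the topic
    // it came from. The map below stands in for a real schema lookup.
    class TopicAwareDeserializer extends Deserializer[String] {

      private val parsersByTopic: Map[String, Array[Byte] => String] = Map(
        "clicks"  -> (bytes => new String(bytes, "UTF-8").trim),
        "metrics" -> (bytes => new String(bytes, "UTF-8").toLowerCase)
      )

      override def configure(configs: util.Map[String, _], isKey: Boolean): Unit = ()

      override def deserialize(topic: String, data: Array[Byte]): String = {
        // The topic argument is exactly what Decoder.fromBytes never receives.
        val parse = parsersByTopic.getOrElse(
          topic, (bytes: Array[Byte]) => new String(bytes, "UTF-8"))
        parse(data)
      }

      override def close(): Unit = ()
    }

With Decoder.fromBytes, the topic is simply not available inside the decode
call, so any topic-keyed logic has to be wired up some other way.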

Any thoughts about changing the Kafka connector to work with the new Kafka
Java Consumer?

Cheers,
Eugene

Re: Kafka consumer: Upgrading to use the new Java Consumer

Posted by Cody Koeninger <co...@koeninger.org>.
Have you seen SPARK-12177?
