Posted to dev@flink.apache.org by "Stephan Ewen (Jira)" <ji...@apache.org> on 2020/11/26 12:52:00 UTC

[jira] [Created] (FLINK-20379) New Kafka Connector does not support DeserializationSchema

Stephan Ewen created FLINK-20379:
------------------------------------

             Summary: New Kafka Connector does not support DeserializationSchema
                 Key: FLINK-20379
                 URL: https://issues.apache.org/jira/browse/FLINK-20379
             Project: Flink
          Issue Type: Bug
          Components: Connectors / Kafka
            Reporter: Stephan Ewen
             Fix For: 1.12.0


The new Kafka Connector defines its own deserialization schema interface, which is incompatible with the existing library of deserializers.

That means that users cannot use Flink's existing formats (Avro, JSON, CSV, Protobuf, Confluent Schema Registry, ...) with the new Kafka Connector.

I think we should change the new Kafka Connector to use the existing DeserializationSchema classes, so that all formats can be used and users can reuse their existing deserializer implementations.

It would also be good to support the existing KafkaDeserializationSchema; otherwise, all users would need to migrate their sources again.
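
For illustration, below is a rough sketch of the kind of adapter every user would otherwise have to write by hand to reuse an existing DeserializationSchema with a per-record deserializer interface like the one the new connector defines. The NewKafkaRecordDeserializer interface here is only illustrative, not the actual API of the new connector:

    import java.io.IOException;
    import java.io.Serializable;

    import org.apache.flink.api.common.serialization.DeserializationSchema;
    import org.apache.flink.util.Collector;
    import org.apache.kafka.clients.consumer.ConsumerRecord;

    public final class DeserializationSchemaAdapters {

        /** Illustrative stand-in for the new connector's per-record deserializer interface. */
        public interface NewKafkaRecordDeserializer<T> extends Serializable {
            void deserialize(ConsumerRecord<byte[], byte[]> record, Collector<T> out) throws IOException;
        }

        private DeserializationSchemaAdapters() {}

        /** Wraps an existing value-only DeserializationSchema (e.g. one of the JSON/Avro/CSV formats). */
        public static <T> NewKafkaRecordDeserializer<T> valueOnly(DeserializationSchema<T> schema) {
            return (record, out) -> {
                // DeserializationSchema may return null for records that should be skipped.
                T value = schema.deserialize(record.value());
                if (value != null) {
                    out.collect(value);
                }
            };
        }
    }

With something like DeserializationSchemaAdapters.valueOnly(new SimpleStringSchema()), an existing format could be plugged in, but note that a proper fix inside the connector would also need to call the schema's open(...) method and forward getProducedType() and isEndOfStream(), which this sketch omits. That is why accepting DeserializationSchema directly in the connector is preferable to ad-hoc wrapping.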




--
This message was sent by Atlassian Jira
(v8.3.4#803005)