Posted to issues@flink.apache.org by " Lasse Nedergaard (Jira)" <ji...@apache.org> on 2019/09/17 17:53:00 UTC

[jira] [Created] (FLINK-14108) Support for Confluent Kafka schema registry for Avro serialisation

 Lasse Nedergaard created FLINK-14108:
-----------------------------------------

             Summary: Support for Confluent Kafka schema registry for Avro serialisation 
                 Key: FLINK-14108
                 URL: https://issues.apache.org/jira/browse/FLINK-14108
             Project: Flink
          Issue Type: New Feature
          Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile)
    Affects Versions: 1.10.0
            Reporter:  Lasse Nedergaard


The current implementation in flink-avro-confluent-registry supports deserialization with schema lookup in the Confluent Kafka schema registry.

I would like support for serialization as well, following the same structure as deserialization. With this feature it would be possible to use the Confluent schema registry in a sink writing Avro to Kafka while registering the schema used at the same time.
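For context, a serializer talking to the Confluent registry must emit records in the Confluent wire format: one magic byte (0x00), the 4-byte big-endian schema id returned by the registry, then the Avro-encoded payload. The sketch below shows only that framing step, independent of Flink; the class and method names are illustrative and not taken from the actual contribution.

```java
import java.nio.ByteBuffer;

// Illustrative sketch (not the actual patch): frames an already
// Avro-encoded payload in the Confluent wire format, i.e.
// magic byte 0x00 | 4-byte big-endian schema id | Avro bytes.
public class ConfluentWireFormat {
    private static final byte MAGIC_BYTE = 0x0;

    public static byte[] frame(int schemaId, byte[] avroPayload) {
        ByteBuffer buffer = ByteBuffer.allocate(1 + 4 + avroPayload.length);
        buffer.put(MAGIC_BYTE);
        buffer.putInt(schemaId); // ByteBuffer writes big-endian by default
        buffer.put(avroPayload);
        return buffer.array();
    }

    public static void main(String[] args) {
        byte[] framed = frame(42, new byte[]{1, 2, 3});
        System.out.println(framed.length); // 1 + 4 + 3 = 8
        System.out.println(framed[0]);     // 0 (magic byte)
        System.out.println(framed[4]);     // 42 (low byte of schema id)
    }
}
```

A Flink serialization schema would obtain the schema id by registering the writer schema with the registry for the target subject, then prepend this header to every serialized record.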

The example in TestAvroConsumerConfluent needs to be updated together with its comment, as the comment indicates it uses the Confluent schema registry for writes, but the example code uses SimpleStringSchema.

We have a running version that we would like to contribute back to the community.

 



--
This message was sent by Atlassian Jira
(v8.3.2#803003)