Posted to issues@flink.apache.org by "Aljoscha Krettek (Jira)" <ji...@apache.org> on 2019/09/27 08:40:00 UTC

[jira] [Assigned] (FLINK-14108) Support for Confluent Kafka schema registry for Avro serialisation

     [ https://issues.apache.org/jira/browse/FLINK-14108?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aljoscha Krettek reassigned FLINK-14108:
----------------------------------------

    Assignee:  Lasse Nedergaard

> Support for Confluent Kafka schema registry for Avro serialisation 
> -------------------------------------------------------------------
>
>                 Key: FLINK-14108
>                 URL: https://issues.apache.org/jira/browse/FLINK-14108
>             Project: Flink
>          Issue Type: New Feature
>          Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile)
>    Affects Versions: 1.10.0
>            Reporter:  Lasse Nedergaard
>            Assignee:  Lasse Nedergaard
>            Priority: Minor
>
> The current implementation in flink-avro-confluent-registry supports deserialization with schema lookup in the Confluent Kafka schema registry. 
> I would like support for serialization as well, following the same structure as deserialization. With this feature it would be possible to use the Confluent schema registry in a sink writing Avro to Kafka and, at the same time, register the schema used.
> The test in TestAvroConsumerConfluent needs to be updated together with its comment, as the comment indicates it uses the Confluent schema registry for writes, but the example code uses SimpleStringSchema.
> We have a running version that we would like to give back to the community.
>  
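For context on what a registry-aware serializer has to produce: Confluent's wire format frames each Kafka record value as one magic byte (0x0), a 4-byte big-endian schema id, and then the Avro-encoded payload; the deserialization side reads the id back out to look up the writer schema. A minimal sketch of just that framing (the class name is illustrative, and the Avro encoding itself is omitted):

```java
import java.nio.ByteBuffer;

// Sketch of the Confluent Schema Registry wire format: one magic byte
// (0x0), a 4-byte big-endian schema id, then the Avro-encoded bytes.
public class ConfluentWireFormat {
    static final byte MAGIC_BYTE = 0x0;

    // Frame an already Avro-encoded payload with the registry header.
    static byte[] frame(int schemaId, byte[] avroPayload) {
        return ByteBuffer.allocate(5 + avroPayload.length)
                .put(MAGIC_BYTE)
                .putInt(schemaId) // ByteBuffer is big-endian by default
                .put(avroPayload)
                .array();
    }

    // Recover the schema id from a framed message, as the existing
    // deserialization path does before fetching the writer schema.
    static int schemaId(byte[] framed) {
        ByteBuffer buf = ByteBuffer.wrap(framed);
        if (buf.get() != MAGIC_BYTE) {
            throw new IllegalArgumentException("Unknown magic byte");
        }
        return buf.getInt();
    }
}
```

A serialization schema mirroring the existing deserialization one would register (or look up) the schema against the registry to obtain the id, Avro-encode the record, and emit the framed bytes above.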



--
This message was sent by Atlassian Jira
(v8.3.4#803005)