Posted to issues@flink.apache.org by "Chesnay Schepler (JIRA)" <ji...@apache.org> on 2019/01/31 14:03:00 UTC

[jira] [Updated] (FLINK-11160) Confluent Avro Serialization Schema

     [ https://issues.apache.org/jira/browse/FLINK-11160?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Chesnay Schepler updated FLINK-11160:
-------------------------------------
    Fix Version/s:     (was: 1.7.2)
                       (was: 1.8.0)

> Confluent Avro Serialization Schema 
> ------------------------------------
>
>                 Key: FLINK-11160
>                 URL: https://issues.apache.org/jira/browse/FLINK-11160
>             Project: Flink
>          Issue Type: Improvement
>          Components: Kafka Connector
>    Affects Versions: 1.7.0
>            Reporter: Zhenhao Li
>            Priority: Minor
>              Labels: newbie, pull-request-available, scala
>   Original Estimate: 24h
>          Time Spent: 10m
>  Remaining Estimate: 23h 50m
>
> Currently, Flink is missing a serialization schema that works with the Confluent Avro format and the Confluent schema registry.
> I wrote an implementation that solves this problem for the company I currently work at, and I think it would be nice to contribute it back to the community. It has been used in a Scala project that has been deployed to production.
> The new serialization schemas only serialize GenericRecord, and users have to pass the Avro schema files to the constructors, so they might not be flexible enough to cover a broader set of use cases. The keyed serialization schema works only for Scala key-value pairs.
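For context on what such a schema must produce: Confluent's documented wire format frames each Avro-encoded record with a magic byte (0x00) followed by the 4-byte big-endian schema ID assigned by the registry, then the Avro binary payload. A minimal sketch of that framing step in Scala (`confluentFrame` is a hypothetical helper name, not part of the attached pull request; registry lookup and Avro encoding are assumed to happen elsewhere):

```scala
import java.nio.ByteBuffer

object ConfluentWireFormat {
  // Confluent wire format: magic byte 0x00, 4-byte big-endian schema ID,
  // then the Avro binary-encoded record.
  def confluentFrame(schemaId: Int, avroPayload: Array[Byte]): Array[Byte] = {
    val buf = ByteBuffer.allocate(1 + 4 + avroPayload.length)
    buf.put(0.toByte)    // magic byte
    buf.putInt(schemaId) // schema ID obtained from the schema registry
    buf.put(avroPayload)
    buf.array()
  }
}
```

A Flink `SerializationSchema[GenericRecord]` along the lines described above would encode the record with Avro's `GenericDatumWriter` and then apply this framing before handing the bytes to the Kafka producer.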



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)