Posted to user@flink.apache.org by Elias Levy <fe...@gmail.com> on 2019/09/12 15:45:21 UTC

Re: Kafka Schema registry

Just for a Kafka source:

https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kafka.html#the-deserializationschema


   - There is also a version of this schema available that can look up the
   writer’s schema (the schema that was used to write the record) in Confluent
   Schema Registry
   <https://docs.confluent.io/current/schema-registry/docs/index.html>.
   Using this deserialization schema, the record will be read with the schema
   retrieved from Schema Registry and transformed into a statically provided
   reader schema (obtained either through
   ConfluentRegistryAvroDeserializationSchema.forGeneric(...) or
   ConfluentRegistryAvroDeserializationSchema.forSpecific(...)).
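The lookup described above works because Confluent serializers embed the writer schema's registry ID in every record. A minimal stdlib-only sketch of that framing, with JSON standing in for the Avro payload and a dict standing in for the registry client (both illustrative assumptions, not Flink's actual implementation):

```python
# Records written through Confluent serializers are framed as:
#   1 magic byte (0x00) | 4-byte big-endian schema ID | Avro-encoded payload
# The deserializer reads the ID and fetches the writer schema from the registry.
import json
import struct

# Stand-in for a registry client: schema ID -> writer schema (illustrative).
FAKE_REGISTRY = {
    7: '{"type": "record", "name": "Event",'
       ' "fields": [{"name": "user", "type": "string"}]}'
}

def deserialize(message: bytes):
    magic, schema_id = struct.unpack(">bI", message[:5])
    if magic != 0:
        raise ValueError("unknown magic byte; not Confluent wire format")
    writer_schema = FAKE_REGISTRY[schema_id]  # registry lookup by embedded ID
    payload = json.loads(message[5:])         # real code would Avro-decode here
    return writer_schema, payload

msg = b"\x00" + struct.pack(">I", 7) + json.dumps({"user": "lasse"}).encode()
schema, record = deserialize(msg)
```

The key point is that the consumer never needs the writer schema up front: the ID in the header identifies it, and the registry supplies it per record.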


On Wed, Sep 11, 2019 at 1:48 PM Lasse Nedergaard <la...@gmail.com>
wrote:

> Hi.
> Does Flink have out-of-the-box support for the Kafka Schema Registry for
> both sources and sinks?
> If not, does anyone know of an implementation we can build on, so we can
> help make it generally available in a future release?
>
> Med venlig hilsen / Best regards
> Lasse Nedergaard
>
>

Re: Kafka Schema registry

Posted by aj <aj...@gmail.com>.
ConfluentRegistryAvroDeserializationSchema.forGeneric() requires a reader
schema. How can we use it to deserialize with the writer’s schema?
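You don't need the writer schema up front: it is fetched from the registry per record, and Avro schema resolution then projects the decoded record onto the reader schema you pass to forGeneric(). An illustrative stdlib miniature of that projection, modeling records as dicts and schemas as field lists (the real logic lives inside the Avro library):

```python
# Miniature of Avro "schema resolution": a record decoded with the writer's
# schema is mapped onto the reader's fields; reader-side defaults fill fields
# the writer never wrote, and extra writer fields are dropped.
def resolve(reader_fields, writer_record):
    out = {}
    for field in reader_fields:
        name = field["name"]
        if name in writer_record:
            out[name] = writer_record[name]      # field known to the writer
        elif "default" in field:
            out[name] = field["default"]         # reader default fills the gap
        else:
            raise ValueError(f"no value or default for field {name!r}")
    return out

# Writer produced {"id", "user"}; reader wants {"id", "region"} with a default.
reader = [{"name": "id"}, {"name": "region", "default": "eu"}]
resolved = resolve(reader, {"id": 42, "user": "lasse"})
```

So forGeneric() asking for a reader schema is not a limitation: the writer's schema is still consulted, record by record, via the registry.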

On Fri, Sep 13, 2019 at 12:04 AM Lasse Nedergaard <la...@gmail.com>
wrote:

> Hi Elias
>
> Thanks for letting me know. I have found it, but we also need the option to
> register Avro schemas and use the registry when we write to Kafka. So we
> will create a serialization version and, once it works, implement it in
> Flink and create a pull request for the community.
>
> Med venlig hilsen / Best regards
> Lasse Nedergaard
>
>
> Den 12. sep. 2019 kl. 17.45 skrev Elias Levy <fearsome.lucidity@gmail.com
> >:
>
> Just for a Kafka source:
>
>
> https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kafka.html#the-deserializationschema
>
>
>    - There is also a version of this schema available that can look up the
>    writer’s schema (the schema that was used to write the record) in Confluent
>    Schema Registry
>    <https://docs.confluent.io/current/schema-registry/docs/index.html>.
>    Using this deserialization schema, the record will be read with the schema
>    retrieved from Schema Registry and transformed into a statically provided
>    reader schema (obtained either through
>    ConfluentRegistryAvroDeserializationSchema.forGeneric(...) or
>    ConfluentRegistryAvroDeserializationSchema.forSpecific(...)).
>
>
> On Wed, Sep 11, 2019 at 1:48 PM Lasse Nedergaard <
> lassenedergaard@gmail.com> wrote:
>
>> Hi.
>> Does Flink have out-of-the-box support for the Kafka Schema Registry for
>> both sources and sinks?
>> If not, does anyone know of an implementation we can build on, so we
>> can help make it generally available in a future release?
>>
>> Med venlig hilsen / Best regards
>> Lasse Nedergaard
>>
>>

-- 
Thanks & Regards,
Anuj Jain
Mob. : +91- 8588817877
Skype : anuj.jain07

Re: Kafka Schema registry

Posted by Lasse Nedergaard <la...@gmail.com>.
Hi Elias

Thanks for letting me know. I have found it, but we also need the option to register Avro schemas and use the registry when we write to Kafka. So we will create a serialization version and, once it works, implement it in Flink and create a pull request for the community.
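A registry-aware serialization schema of the kind described here would do the inverse of the deserializer: register (or look up) the schema under the topic's subject to obtain an ID, then prefix the encoded payload with the Confluent wire-format header. A hedged stdlib sketch, with an in-memory registry and JSON standing in for a real registry client and Avro encoding:

```python
# Conceptual sketch of the write path: register schema -> get ID -> frame the
# payload as magic byte 0x00 + 4-byte big-endian schema ID + encoded record.
# FakeRegistryClient and the JSON payload are illustrative stand-ins.
import json
import struct

class FakeRegistryClient:
    def __init__(self):
        self._ids = {}

    def register(self, subject: str, schema: str) -> int:
        # Real clients POST to the registry's /subjects/.../versions endpoint
        # and cache the returned ID; here we just assign IDs in memory.
        key = (subject, schema)
        if key not in self._ids:
            self._ids[key] = len(self._ids) + 1
        return self._ids[key]

def serialize(client, subject, schema, record) -> bytes:
    schema_id = client.register(subject, schema)
    header = struct.pack(">bI", 0, schema_id)    # magic byte + schema ID
    return header + json.dumps(record).encode()  # real code would Avro-encode

client = FakeRegistryClient()
event_schema = ('{"type": "record", "name": "Event",'
                ' "fields": [{"name": "user", "type": "string"}]}')
wire = serialize(client, "events", event_schema, {"user": "lasse"})
```

Any consumer that understands the wire format can then recover the schema ID from the first five bytes and fetch the writer schema from the registry.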

Med venlig hilsen / Best regards
Lasse Nedergaard


> Den 12. sep. 2019 kl. 17.45 skrev Elias Levy <fe...@gmail.com>:
> 
> Just for a Kafka source:
> 
> https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kafka.html#the-deserializationschema
> 
> There is also a version of this schema available that can look up the writer’s schema (the schema that was used to write the record) in Confluent Schema Registry. Using this deserialization schema, the record will be read with the schema retrieved from Schema Registry and transformed into a statically provided reader schema (obtained either through ConfluentRegistryAvroDeserializationSchema.forGeneric(...) or ConfluentRegistryAvroDeserializationSchema.forSpecific(...)).
> 
>> On Wed, Sep 11, 2019 at 1:48 PM Lasse Nedergaard <la...@gmail.com> wrote:
>> Hi. 
>> Does Flink have out-of-the-box support for the Kafka Schema Registry for both sources and sinks?
>> If not, does anyone know of an implementation we can build on, so we can help make it generally available in a future release?
>> 
>> Med venlig hilsen / Best regards
>> Lasse Nedergaard
>>