Posted to dev@flink.apache.org by "Leonard Xu (Jira)" <ji...@apache.org> on 2020/02/13 15:15:00 UTC

[jira] [Created] (FLINK-16048) Support reading Avro data serialized by KafkaAvroSerializer from Kafka in Table

Leonard Xu created FLINK-16048:
----------------------------------

             Summary: Support reading Avro data serialized by KafkaAvroSerializer from Kafka in Table
                 Key: FLINK-16048
                 URL: https://issues.apache.org/jira/browse/FLINK-16048
             Project: Flink
          Issue Type: Improvement
          Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile)
    Affects Versions: 1.11.0
            Reporter: Leonard Xu
             Fix For: 1.11.0


I found that the SQL Kafka connector cannot consume Avro data that was serialized by `KafkaAvroSerializer`; it can only consume Row data encoded with a plain Avro schema, because we use `AvroRowDeserializationSchema`/`AvroRowSerializationSchema` to serialize/deserialize data in `AvroRowFormatFactory`.

I think we should support this because `KafkaAvroSerializer` is very common in Kafka setups.

Someone ran into the same problem on Stack Overflow [1].


[1] https://stackoverflow.com/questions/56452571/caused-by-org-apache-avro-avroruntimeexception-malformed-data-length-is-negat/56478259
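
For context, records written by `KafkaAvroSerializer` use the Confluent wire format: one magic byte (0x0), a 4-byte schema registry id, and only then the Avro-encoded payload. `AvroRowDeserializationSchema` expects the Avro payload to start at byte 0, which is why consuming such records fails (see the "Malformed data. Length is negative" error in [1]). The sketch below is not the fix proposed by this ticket, just a hypothetical workaround (the class name is made up for illustration) under the assumption that every record was written with one known schema, i.e. no schema evolution through the registry: wrap the existing deserializer and strip the 5-byte header first.

import java.io.IOException;
import java.util.Arrays;

import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.formats.avro.AvroRowDeserializationSchema;
import org.apache.flink.types.Row;

/** Hypothetical wrapper: strips the Confluent wire-format header and delegates to the plain Avro row deserializer. */
public class ConfluentWireFormatRowDeserializationSchema implements DeserializationSchema<Row> {

    // Confluent header: 1 magic byte (0x0) + 4-byte schema registry id.
    private static final int CONFLUENT_HEADER_LENGTH = 5;

    private final AvroRowDeserializationSchema delegate;

    public ConfluentWireFormatRowDeserializationSchema(String avroSchemaString) {
        this.delegate = new AvroRowDeserializationSchema(avroSchemaString);
    }

    @Override
    public Row deserialize(byte[] message) throws IOException {
        if (message == null || message.length < CONFLUENT_HEADER_LENGTH || message[0] != 0) {
            throw new IOException("Record is not in Confluent Avro wire format");
        }
        // Drop the 5-byte header and hand the raw Avro payload to the plain Avro deserializer.
        byte[] avroPayload = Arrays.copyOfRange(message, CONFLUENT_HEADER_LENGTH, message.length);
        return delegate.deserialize(avroPayload);
    }

    @Override
    public boolean isEndOfStream(Row nextElement) {
        return false;
    }

    @Override
    public TypeInformation<Row> getProducedType() {
        return delegate.getProducedType();
    }
}

A proper fix would instead use the schema id embedded in each record to look up the writer schema from the schema registry, so schema evolution keeps working; the sketch above simply discards that information.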



--
This message was sent by Atlassian Jira
(v8.3.4#803005)