Posted to dev@flink.apache.org by "DuBin (Jira)" <ji...@apache.org> on 2020/06/23 05:38:00 UTC

[jira] [Created] (FLINK-18414) Kafka JSON connector in Table API should support more options

DuBin created FLINK-18414:
-----------------------------

             Summary: Kafka JSON connector in Table API should support more options
                 Key: FLINK-18414
                 URL: https://issues.apache.org/jira/browse/FLINK-18414
             Project: Flink
          Issue Type: Improvement
          Components: Connectors / Kafka, Formats (JSON, Avro, Parquet, ORC, SequenceFile), Table SQL / Ecosystem
    Affects Versions: 1.10.1
            Reporter: DuBin


Currently, Flink uses 'org.apache.flink.formats.json.JsonRowDeserializationSchema' to deserialize each record into a Row when a Kafka JSON table source is defined.

But the parser is hard-coded in the class:

private final ObjectMapper objectMapper = new ObjectMapper();

Imagine that the JSON data source contains data like this:

{"a":NaN,"b":1.2}

or contains some other dirty data: the deserialize method will then throw an exception on every such record, because Kafka does not perform any schema validation on the JSON format.
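
For reference, Jackson's ObjectMapper can already be configured to accept such non-standard tokens; the point is that JsonRowDeserializationSchema offers no way to reach that configuration. A minimal sketch in plain Jackson (outside of Flink, class name illustrative) of what such an option would enable:

import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class LenientJsonParseExample {
    public static void main(String[] args) throws Exception {
        ObjectMapper objectMapper = new ObjectMapper();
        // Opt in to non-standard tokens such as NaN and Infinity,
        // which the default (strict) ObjectMapper rejects.
        objectMapper.configure(JsonParser.Feature.ALLOW_NON_NUMERIC_NUMBERS, true);

        JsonNode node = objectMapper.readTree("{\"a\":NaN,\"b\":1.2}");
        System.out.println(node.get("a").doubleValue()); // prints NaN
        System.out.println(node.get("b").doubleValue()); // prints 1.2
    }
}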


So can we add more options to 'org.apache.flink.formats.json.JsonRowFormatFactory', specifically in 'org.apache.flink.formats.json.JsonRowFormatFactory#createDeserializationSchema'? For example, expose configuration options for the ObjectMapper, and allow a user-defined dirty-data handler (e.g. one that simply returns an empty row).
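
To illustrate the dirty-data handler idea, below is a minimal sketch of a wrapper around an existing DeserializationSchema<Row> that swallows parse errors and returns null (the DeserializationSchema contract permits returning null when a message cannot be deserialized, in which case the source skips the record). The class name is hypothetical, not existing Flink API; a user-supplied handler could just as well substitute an empty Row:

import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.types.Row;

import java.io.IOException;

public class FailureTolerantRowDeserializationSchema implements DeserializationSchema<Row> {

    private final DeserializationSchema<Row> inner;

    public FailureTolerantRowDeserializationSchema(DeserializationSchema<Row> inner) {
        this.inner = inner;
    }

    @Override
    public Row deserialize(byte[] message) throws IOException {
        try {
            return inner.deserialize(message);
        } catch (IOException e) {
            // Dirty record: return null so the source skips it
            // instead of failing the whole job.
            return null;
        }
    }

    @Override
    public boolean isEndOfStream(Row nextElement) {
        return false;
    }

    @Override
    public TypeInformation<Row> getProducedType() {
        return inner.getProducedType();
    }
}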




--
This message was sent by Atlassian Jira
(v8.3.4#803005)