Posted to issues@flink.apache.org by "Jark Wu (Jira)" <ji...@apache.org> on 2020/12/04 03:06:00 UTC

[jira] [Assigned] (FLINK-20470) MissingNode can't be cast to ObjectNode when deserializing JSON

     [ https://issues.apache.org/jira/browse/FLINK-20470?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jark Wu reassigned FLINK-20470:
-------------------------------

    Assignee: zhuxiaoshang

> MissingNode can't be cast to ObjectNode when deserializing JSON
> ---------------------------------------------------------------
>
>                 Key: FLINK-20470
>                 URL: https://issues.apache.org/jira/browse/FLINK-20470
>             Project: Flink
>          Issue Type: Bug
>          Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile), Table SQL / Ecosystem
>    Affects Versions: 1.12.0, 1.11.2
>            Reporter: Jark Wu
>            Assignee: zhuxiaoshang
>            Priority: Major
>
> {code}
> Caused by: java.io.IOException: Failed to deserialize JSON ''.
>         at org.apache.flink.formats.json.JsonRowDataDeserializationSchema.deserialize(JsonRowDataDeserializationSchema.java:126) ~[flink-json-1.11.2.jar:1.11.2]
>         at org.apache.flink.formats.json.JsonRowDataDeserializationSchema.deserialize(JsonRowDataDeserializationSchema.java:76) ~[flink-json-1.11.2.jar:1.11.2]
>         at org.apache.flink.api.common.serialization.DeserializationSchema.deserialize(DeserializationSchema.java:81) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
>         at org.apache.flink.streaming.connectors.kafka.internals.KafkaDeserializationSchemaWrapper.deserialize(KafkaDeserializationSchemaWrapper.java:56) ~[flink-sql-connector-kafka_2.11-1.11.2.jar:1.11.2]
>         at org.apache.flink.streaming.connectors.kafka.internal.KafkaFetcher.partitionConsumerRecordsHandler(KafkaFetcher.java:181) ~[flink-sql-connector-kafka_2.11-1.11.2.jar:1.11.2]
>         at org.apache.flink.streaming.connectors.kafka.internal.KafkaFetcher.runFetchLoop(KafkaFetcher.java:141) ~[flink-sql-connector-kafka_2.11-1.11.2.jar:1.11.2]
>         at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.run(FlinkKafkaConsumerBase.java:755) ~[flink-sql-connector-kafka_2.11-1.11.2.jar:1.11.2]
>         at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:100) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
>         at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:63) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
>         at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:213) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
> Caused by: java.lang.ClassCastException: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.MissingNode cannot be cast to org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode
> {code}
> Currently, we only check {{jsonNode == null || jsonNode.isNull()}} for a nullable node; I think we should also take MissingNode into account.
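> The guard described above can be sketched as follows. This is a hedged, self-contained illustration, not the actual Flink patch: it uses minimal stand-in node classes (the real ones are Jackson's {{JsonNode}}, {{ObjectNode}}, and {{MissingNode}}, where {{isMissingNode()}} is a real Jackson API). It shows why the original {{null || isNull()}} check lets a MissingNode through, leading to the ClassCastException when the node is later cast to ObjectNode.
> {code}
> // Stand-in hierarchy mimicking Jackson's node types for illustration only.
> abstract class JsonNode {
>     boolean isNull() { return false; }
>     boolean isMissingNode() { return false; }
> }
>
> class ObjectNode extends JsonNode { }
>
> // Returned by Jackson when parsing yields no content (e.g. an empty message '').
> class MissingNode extends JsonNode {
>     @Override boolean isMissingNode() { return true; }
> }
>
> public class MissingNodeCheck {
>     // Original check: treats only null and NullNode as nullable,
>     // so a MissingNode slips through and is later cast to ObjectNode.
>     static boolean isNullableNodeOld(JsonNode node) {
>         return node == null || node.isNull();
>     }
>
>     // Proposed check: also short-circuit on MissingNode before any cast.
>     static boolean isNullableNode(JsonNode node) {
>         return node == null || node.isNull() || node.isMissingNode();
>     }
>
>     public static void main(String[] args) {
>         JsonNode missing = new MissingNode();
>         System.out.println(isNullableNodeOld(missing)); // false: the bug
>         System.out.println(isNullableNode(missing));    // true: handled
>     }
> }
> {code}
> With the extra {{isMissingNode()}} clause, an empty or unparseable payload is treated like a null record instead of being force-cast to ObjectNode.

```java
// Stand-in hierarchy mimicking Jackson's node types for illustration only.
abstract class JsonNode {
    boolean isNull() { return false; }
    boolean isMissingNode() { return false; }
}

class ObjectNode extends JsonNode { }

// Returned by Jackson when parsing yields no content (e.g. an empty message '').
class MissingNode extends JsonNode {
    @Override boolean isMissingNode() { return true; }
}

public class MissingNodeCheck {
    // Original check: treats only null and NullNode as nullable,
    // so a MissingNode slips through and is later cast to ObjectNode.
    static boolean isNullableNodeOld(JsonNode node) {
        return node == null || node.isNull();
    }

    // Proposed check: also short-circuit on MissingNode before any cast.
    static boolean isNullableNode(JsonNode node) {
        return node == null || node.isNull() || node.isMissingNode();
    }

    public static void main(String[] args) {
        JsonNode missing = new MissingNode();
        System.out.println(isNullableNodeOld(missing)); // false: the bug
        System.out.println(isNullableNode(missing));    // true: handled
    }
}
```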



--
This message was sent by Atlassian Jira
(v8.3.4#803005)