Posted to issues@flink.apache.org by "Kenneth William Krugler (Jira)" <ji...@apache.org> on 2020/05/06 22:41:00 UTC
[jira] [Commented] (FLINK-17478) Avro format logical type conversions do not work due to type mismatch
[ https://issues.apache.org/jira/browse/FLINK-17478?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17101239#comment-17101239 ]
Kenneth William Krugler commented on FLINK-17478:
-------------------------------------------------
Hi [~gyfora] - is this related to FLINK-17486?
> Avro format logical type conversions do not work due to type mismatch
> ---------------------------------------------------------------------
>
> Key: FLINK-17478
> URL: https://issues.apache.org/jira/browse/FLINK-17478
> Project: Flink
> Issue Type: Sub-task
> Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile), Table SQL / Planner
> Affects Versions: 1.10.0
> Reporter: Gyula Fora
> Priority: Major
>
> We hit the following issue when trying to use avro logical timestamp types:
>
> {code:java}
> CREATE TABLE source_table (
>   int_field INT,
>   timestamp_field TIMESTAMP(3)
> ) WITH (
>   'connector.type' = 'kafka',
>   'connector.version' = 'universal',
>   'connector.topic' = 'avro_tset',
>   'connector.properties.bootstrap.servers' = '<...>',
>   'format.type' = 'avro',
>   'format.avro-schema' =
>     '{
>       "type": "record",
>       "name": "test",
>       "fields" : [
>         {"name": "int_field", "type": "int"},
>         {"name": "timestamp_field", "type": {"type": "long", "logicalType": "timestamp-millis"}}
>       ]
>     }'
> );
>
> INSERT INTO source_table VALUES (12, TIMESTAMP '1999-11-11 11:11:11');
> {code}
>
> And the error:
> {noformat}
> Caused by: java.lang.ClassCastException: java.time.LocalDateTime cannot be cast to java.lang.Long
>     at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:131)
>     at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:72)
>     at org.apache.avro.generic.GenericDatumWriter.writeField(GenericDatumWriter.java:166)
>     at org.apache.avro.specific.SpecificDatumWriter.writeField(SpecificDatumWriter.java:90)
>     at org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:156)
>     at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:118)
>     at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:75)
>     at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:62)
>     at org.apache.flink.formats.avro.AvroRowSerializationSchema.serialize(AvroRowSerializationSchema.java:143)
> {noformat}
> Dawid's analysis from the ML discussion:
> It seems that the information about the bridging class (java.sql.Timestamp in this case) is lost somewhere in the stack. Because this information is lost or not respected, the planner produces a java.time.LocalDateTime instead of the expected java.sql.Timestamp. The AvroRowSerializationSchema expects java.sql.Timestamp for a column of type TIMESTAMP and therefore fails when it is handed a LocalDateTime.
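To make the mismatch concrete, here is a minimal JDK-only sketch (not Flink or Avro code) of the two conversions involved. Avro encodes the timestamp-millis logical type as a long of epoch milliseconds; the serializer knows how to get there from a java.sql.Timestamp, but the raw LocalDateTime the planner produces reaches Avro unconverted and is cast straight to Long, which throws. The class name, method names, and the UTC zone choice below are illustrative assumptions, not Flink internals.

```java
import java.sql.Timestamp;
import java.time.LocalDateTime;
import java.time.ZoneOffset;

public class TimestampBridgeSketch {

    // What the serializer can do when it receives the bridging class:
    // java.sql.Timestamp already carries epoch millis directly.
    static long fromSqlTimestamp(Timestamp ts) {
        return ts.getTime();
    }

    // What would be needed for the LocalDateTime the planner actually
    // produces: an explicit conversion to epoch millis. LocalDateTime has
    // no zone, so a zone must be assumed (UTC here, as an illustration).
    static long fromLocalDateTime(LocalDateTime ldt) {
        return ldt.toInstant(ZoneOffset.UTC).toEpochMilli();
    }

    public static void main(String[] args) {
        // The literal from the bug report: TIMESTAMP '1999-11-11 11:11:11'
        LocalDateTime ldt = LocalDateTime.of(1999, 11, 11, 11, 11, 11);
        // Without a conversion like fromLocalDateTime, Avro's
        // GenericDatumWriter casts the LocalDateTime object to Long,
        // which is exactly the ClassCastException in the stack trace.
        System.out.println(fromLocalDateTime(ldt));
    }
}
```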
--
This message was sent by Atlassian Jira
(v8.3.4#803005)