Posted to issues@spark.apache.org by "Duc Hoa Nguyen (Jira)" <ji...@apache.org> on 2020/12/23 06:55:00 UTC

[jira] [Updated] (SPARK-33888) AVRO SchemaConverters - logicalType TimeMillis not being converted to Timestamp type

     [ https://issues.apache.org/jira/browse/SPARK-33888?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Duc Hoa Nguyen updated SPARK-33888:
-----------------------------------
    Description: 
We encountered an issue where the Avro logical type `TimeMillis` is not converted to the Spark `Timestamp` type by `SchemaConverters`; it is converted to a plain `int` instead. This is reproducible by ingesting data from a MySQL table with a column of TIME type: the Spark JDBC DataFrame gets the correct type (Timestamp), but enforcing our Avro schema (`{"type": "int", "logicalType": "time-millis"}`) externally fails with the following exception:

{{java.lang.RuntimeException: java.sql.Timestamp is not a valid external type for schema of int}}
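
A minimal sketch of the conversion step, runnable in a spark-shell with the spark-avro module on the classpath (assuming Spark 3.x, where {{org.apache.spark.sql.avro.SchemaConverters}} is exposed as a developer API; the schema construction here is illustrative):

{code:scala}
import org.apache.avro.{LogicalTypes, Schema}
import org.apache.spark.sql.avro.SchemaConverters

// Avro "int" carrying the time-millis logical type, as in the schema above.
val timeMillis: Schema =
  LogicalTypes.timeMillis().addToSchema(Schema.create(Schema.Type.INT))

// Expected (per this report): TimestampType; observed: IntegerType.
val converted = SchemaConverters.toSqlType(timeMillis)
println(converted.dataType)
{code}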




> AVRO SchemaConverters - logicalType TimeMillis not being converted to Timestamp type
> ------------------------------------------------------------------------------------
>
>                 Key: SPARK-33888
>                 URL: https://issues.apache.org/jira/browse/SPARK-33888
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.3, 3.0.0, 3.0.1
>            Reporter: Duc Hoa Nguyen
>            Priority: Minor
>
> We encountered an issue where the Avro logical type `TimeMillis` is not converted to the Spark `Timestamp` type by `SchemaConverters`; it is converted to a plain `int` instead. This is reproducible by ingesting data from a MySQL table with a column of TIME type: the Spark JDBC DataFrame gets the correct type (Timestamp), but enforcing our Avro schema (`{"type": "int", "logicalType": "time-millis"}`) externally fails with the following exception:
> {{java.lang.RuntimeException: java.sql.Timestamp is not a valid external type for schema of int}}
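
For reference, a minimal sketch of how that type mismatch can surface at runtime; the column name, the sample value, and the local SparkSession setup are illustrative (not from the report), and the exact error wording may vary by Spark version:

{code:scala}
import java.sql.Timestamp
import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.types.{IntegerType, StructField, StructType}

val spark = SparkSession.builder().master("local[*]").appName("SPARK-33888-repro").getOrCreate()

// Schema as derived from the Avro schema above: int instead of timestamp.
val intSchema = StructType(Seq(StructField("event_time", IntegerType, nullable = true)))

// The JDBC source hands back java.sql.Timestamp values for the TIME column.
val rows = spark.sparkContext.parallelize(Seq(Row(Timestamp.valueOf("1970-01-01 12:34:56"))))

// Applying the int-typed schema to timestamp data fails when the rows are
// evaluated, with an error along the lines of the RuntimeException quoted above.
spark.createDataFrame(rows, intSchema).show()
{code}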



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org