Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2019/02/16 18:28:00 UTC

[jira] [Assigned] (SPARK-26902) Support java.time.Instant as an external type of TimestampType

     [ https://issues.apache.org/jira/browse/SPARK-26902?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-26902:
------------------------------------

    Assignee: Apache Spark

> Support java.time.Instant as an external type of TimestampType
> --------------------------------------------------------------
>
>                 Key: SPARK-26902
>                 URL: https://issues.apache.org/jira/browse/SPARK-26902
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Maxim Gekk
>            Assignee: Apache Spark
>            Priority: Major
>
> Currently, Spark supports the java.sql.Date and java.sql.Timestamp types as external types for Catalyst's DateType and TimestampType: it accepts and produces values of those types. Since Java 8, the base classes for dates and timestamps are java.time.Instant, java.time.LocalDate/LocalDateTime, and java.time.ZonedDateTime. New converters to and from Instant need to be added.
> The Instant type holds epoch seconds (plus nanoseconds) and maps directly onto Catalyst's TimestampType.
> Main motivations for the changes:
> - Smooth support for the Java 8 time API
> - Avoid the inconsistency between the calendar used inside Spark 3.0 (Proleptic Gregorian) and the one used by java.sql.Timestamp (a hybrid Julian + Gregorian calendar)
> - Make conversions independent of the current system time zone
> When collecting values of DateType/TimestampType, the following SQL config can control the type of the returned values:
>  - spark.sql.catalyst.timestampType with supported values "java.sql.Timestamp" (the default) and "java.time.Instant"
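The conversion the ticket describes boils down to mapping an Instant to and from Catalyst's internal TimestampType representation, which is a count of microseconds since the Unix epoch. The sketch below is a minimal, self-contained illustration of that mapping; the method names `instantToMicros` and `microsToInstant` are hypothetical helpers for this example, not Spark's actual API. Note that nanosecond precision below one microsecond is truncated, since TimestampType cannot represent it.

```java
import java.time.Instant;

public class InstantMicros {
    // Catalyst's TimestampType stores timestamps as microseconds
    // since 1970-01-01T00:00:00Z, independent of any time zone.
    static long instantToMicros(Instant instant) {
        // getEpochSecond() floors toward negative infinity and
        // getNano() is always in [0, 999_999_999], so this also
        // handles instants before the epoch correctly.
        long micros = Math.multiplyExact(instant.getEpochSecond(), 1_000_000L);
        return Math.addExact(micros, instant.getNano() / 1_000L);
    }

    static Instant microsToInstant(long micros) {
        // floorDiv/floorMod keep the seconds/nanos split consistent
        // for negative microsecond counts as well.
        long seconds = Math.floorDiv(micros, 1_000_000L);
        long nanos = Math.floorMod(micros, 1_000_000L) * 1_000L;
        return Instant.ofEpochSecond(seconds, nanos);
    }

    public static void main(String[] args) {
        Instant t = Instant.parse("2019-02-16T18:28:00.123456Z");
        long micros = instantToMicros(t);
        System.out.println(micros);
        System.out.println(microsToInstant(micros));
    }
}
```

Because both the internal form and Instant are anchored to UTC, the round trip never consults the JVM's default time zone, which is exactly the independence the ticket's third motivation asks for.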



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org