Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2019/03/01 03:06:00 UTC
[jira] [Resolved] (SPARK-27008) Support java.time.LocalDate as an external type of DateType
[ https://issues.apache.org/jira/browse/SPARK-27008?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan resolved SPARK-27008.
---------------------------------
Resolution: Fixed
Fix Version/s: 3.0.0
Issue resolved by pull request 23913
[https://github.com/apache/spark/pull/23913]
> Support java.time.LocalDate as an external type of DateType
> -----------------------------------------------------------
>
> Key: SPARK-27008
> URL: https://issues.apache.org/jira/browse/SPARK-27008
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 2.4.0
> Reporter: Maxim Gekk
> Assignee: Maxim Gekk
> Priority: Major
> Fix For: 3.0.0
>
>
> Currently, Spark supports java.sql.Date as the external type for Catalyst's DateType; it accepts and produces values of that type. Since Java 8, the base classes for dates and times are java.time.Instant, java.time.LocalDate/LocalDateTime, and java.time.ZonedDateTime. New converters from/to LocalDate need to be added.
> The LocalDate type holds epoch days and maps directly to Catalyst's DateType.
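The epoch-day mapping mentioned above can be seen directly in the standard library: LocalDate.toEpochDay returns the count of days since 1970-01-01, which is the same representation Catalyst's DateType uses internally, so conversion needs no timezone or calendar rebasing. A minimal illustration (class name is arbitrary):

```java
import java.time.LocalDate;

public class EpochDaysDemo {
    public static void main(String[] args) {
        // Catalyst's DateType stores a date as the number of days since
        // 1970-01-01. java.time.LocalDate exposes exactly that value via
        // toEpochDay, so the converter is a trivial, timezone-free mapping.
        LocalDate date = LocalDate.of(2019, 3, 1);
        long epochDays = date.toEpochDay();            // 17956
        LocalDate roundTrip = LocalDate.ofEpochDay(epochDays);
        System.out.println(date + " -> " + epochDays + " -> " + roundTrip);
    }
}
```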
> Main motivations for the changes:
> - Smoothly support Java 8 time API
> - Avoid the inconsistency between the calendar used by Spark 3.0 (Proleptic Gregorian) and the one used by java.sql.Date (a hybrid Julian + Gregorian calendar).
> - Make the conversion independent of the current system timezone.
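The timezone dependence that the last motivation point refers to can be demonstrated with plain JDK classes: java.sql.Date wraps a millisecond instant, so turning a calendar date into one implicitly uses the JVM's default timezone, whereas LocalDate does not. A small sketch (class name is arbitrary):

```java
import java.sql.Date;
import java.util.TimeZone;

public class TimezoneDependenceDemo {
    public static void main(String[] args) {
        // java.sql.Date.valueOf interprets the date string at local midnight
        // in the JVM's default timezone, so the resulting instant changes
        // when the default timezone changes.
        TimeZone.setDefault(TimeZone.getTimeZone("UTC"));
        long utcMillis = Date.valueOf("2019-03-01").getTime();

        TimeZone.setDefault(TimeZone.getTimeZone("America/Los_Angeles"));
        long laMillis = Date.valueOf("2019-03-01").getTime();

        // Same calendar date, two different instants: the conversion is not
        // timezone independent, unlike LocalDate.toEpochDay.
        System.out.println(laMillis - utcMillis); // 8 hours, in milliseconds
    }
}
```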
> When collecting values of DateType, the following SQL config controls the type of the returned values:
> - spark.sql.catalyst.dateType, with supported values "Date" (the default, java.sql.Date) and "Instant" (java.time.LocalDate)
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org