Posted to issues@spark.apache.org by "Maxim Gekk (Jira)" <ji...@apache.org> on 2019/10/10 21:45:00 UTC
[jira] [Commented] (SPARK-26651) Use Proleptic Gregorian calendar
[ https://issues.apache.org/jira/browse/SPARK-26651?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16948957#comment-16948957 ]
Maxim Gekk commented on SPARK-26651:
------------------------------------
[~jiangxb] Could you consider including this in the list of major changes for Spark 3.0?
> Use Proleptic Gregorian calendar
> --------------------------------
>
> Key: SPARK-26651
> URL: https://issues.apache.org/jira/browse/SPARK-26651
> Project: Spark
> Issue Type: Umbrella
> Components: SQL
> Affects Versions: 2.4.0
> Reporter: Maxim Gekk
> Assignee: Maxim Gekk
> Priority: Major
> Labels: ReleaseNote
>
> Spark 2.4 and previous versions use a hybrid calendar (Julian + Gregorian) in date/timestamp parsing, functions, and expressions. This ticket aims to switch Spark to the Proleptic Gregorian calendar and to use the java.time classes introduced in Java 8 for date/timestamp manipulations. One purpose of switching to the Proleptic Gregorian calendar is to conform to the SQL standard, which assumes that calendar.
> *Release note:*
> Spark 3.0 has switched to the Proleptic Gregorian calendar in parsing, formatting, and converting dates and timestamps, as well as in extracting sub-components such as years and days. It uses Java 8 API classes from the java.time packages, which are based on the [ISO chronology|https://docs.oracle.com/javase/8/docs/api/java/time/chrono/IsoChronology.html]. Previous versions of Spark performed those operations using [the hybrid calendar|https://docs.oracle.com/javase/7/docs/api/java/util/GregorianCalendar.html] (Julian + Gregorian). The changes might affect the results for dates and timestamps before October 15, 1582 (Gregorian).
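As an illustration (not part of the original ticket), the behavioral difference for pre-reform dates can be demonstrated directly with the two JVM calendar APIs the release note refers to: the hybrid java.util.GregorianCalendar skips the ten days dropped by the 1582 Gregorian reform, while java.time extends the Gregorian rules backwards so those days exist.

```java
import java.time.LocalDate;
import java.util.Calendar;
import java.util.GregorianCalendar;

public class CalendarComparison {
    public static void main(String[] args) {
        // Hybrid Julian + Gregorian calendar (Spark 2.4 and earlier):
        // the day after October 4, 1582 is October 15, 1582, because the
        // default cutover skips the 10 days removed by the Gregorian reform.
        GregorianCalendar hybrid = new GregorianCalendar(1582, Calendar.OCTOBER, 4);
        hybrid.add(Calendar.DAY_OF_MONTH, 1);
        System.out.println(hybrid.get(Calendar.DAY_OF_MONTH)); // 15

        // Proleptic Gregorian calendar (java.time, Spark 3.0):
        // Gregorian rules are applied to all dates, so October 5, 1582 exists.
        LocalDate proleptic = LocalDate.of(1582, 10, 4).plusDays(1);
        System.out.println(proleptic); // 1582-10-05
    }
}
```

The same date value can therefore map to different day counts under the two calendars, which is why results for pre-1582-10-15 dates and timestamps may change between Spark 2.4 and 3.0.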
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org