Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2020/03/11 13:17:00 UTC

[jira] [Resolved] (SPARK-31076) Convert Catalyst's DATE/TIMESTAMP to Java Date/Timestamp via local date-time

     [ https://issues.apache.org/jira/browse/SPARK-31076?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan resolved SPARK-31076.
---------------------------------
    Fix Version/s: 3.0.0
       Resolution: Fixed

Issue resolved by pull request 27807
[https://github.com/apache/spark/pull/27807]

> Convert Catalyst's DATE/TIMESTAMP to Java Date/Timestamp via local date-time
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-31076
>                 URL: https://issues.apache.org/jira/browse/SPARK-31076
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Maxim Gekk
>            Assignee: Maxim Gekk
>            Priority: Major
>             Fix For: 3.0.0
>
>
> By default, collect() returns java.sql.Timestamp/Date instances whose offsets are derived from the internal values of Catalyst's TIMESTAMP/DATE, which store microseconds (for TIMESTAMP) or days (for DATE) since the epoch. The conversion from internal values to java.sql.Timestamp/Date is based on the Proleptic Gregorian calendar, but converting the resulting values to strings for dates before the year 1582 produces timestamp/date strings in the Julian calendar. For example:
> {code}
> scala> sql("select date '1100-10-10'").collect()
> res1: Array[org.apache.spark.sql.Row] = Array([1100-10-03])
> {code} 
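> The seven-day shift can be reproduced outside Spark (a minimal sketch, not Spark code; it assumes a UTC default time zone):
> {code}
> import java.time.LocalDate
>
> // Day count for 1100-10-10 in the Proleptic Gregorian calendar (java.time's default)
> val epochDay = LocalDate.of(1100, 10, 10).toEpochDay
> // Millis since the epoch for midnight UTC of that day
> val millis = epochDay * 24L * 60 * 60 * 1000
> // java.sql.Date.toString renders instants before the 1582-10-15 cutover in the
> // Julian calendar, which is seven days behind Gregorian in the year 1100:
> println(new java.sql.Date(millis))  // prints 1100-10-03 when the default time zone is UTC
> {code}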
> This can be fixed by converting Catalyst's internal values to a local date-time in the Proleptic Gregorian calendar, and then constructing java.sql.Timestamp/Date from the resulting year, month, ..., seconds fields, which the legacy classes interpret in the Julian calendar.
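> A minimal sketch of that approach for DATE (a hypothetical helper, not the actual patch; Spark's real conversion also accounts for the session time zone):
> {code}
> import java.time.LocalDate
>
> // Hypothetical helper: interpret the internal day count as a Proleptic
> // Gregorian local date, then rebuild java.sql.Date from its fields.
> def toJavaDate(daysSinceEpoch: Int): java.sql.Date = {
>   val localDate = LocalDate.ofEpochDay(daysSinceEpoch)
>   // Date.valueOf copies the year/month/day field values, and the legacy hybrid
>   // calendar then renders those same fields: DATE '1100-10-10' stays "1100-10-10"
>   java.sql.Date.valueOf(localDate)
> }
> {code}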



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org