Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/06/10 08:31:21 UTC

[jira] [Resolved] (SPARK-13268) SQL Timestamp stored as GMT but toString returns GMT-08:00

     [ https://issues.apache.org/jira/browse/SPARK-13268?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-13268.
-------------------------------
    Resolution: Not A Problem

All of these classes are JDK classes.
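
For illustration, a minimal, Spark-free sketch of the point behind this resolution: java.sql.Timestamp stores a UTC epoch value, but its toString renders that value in the JVM default time zone, so the apparent offset comes from the JDK, not from Spark SQL. The zone IDs below are chosen only to reproduce the -08:00 case from the report.

{code}
import java.sql.Timestamp
import java.time.Instant
import java.util.TimeZone

object TimestampToStringDemo {
  def main(args: Array[String]): Unit = {
    // 2015-01-01T00:00:00Z as UTC epoch millis; Timestamp stores exactly this value.
    val epochMillis = Instant.parse("2015-01-01T00:00:00Z").toEpochMilli

    // toString formats in the JVM default time zone, so the same stored value
    // prints differently depending only on that zone.
    TimeZone.setDefault(TimeZone.getTimeZone("America/Los_Angeles"))
    println(new Timestamp(epochMillis))   // 2014-12-31 16:00:00.0

    TimeZone.setDefault(TimeZone.getTimeZone("GMT"))
    println(new Timestamp(epochMillis))   // 2015-01-01 00:00:00.0
  }
}
{code}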

> SQL Timestamp stored as GMT but toString returns GMT-08:00
> ----------------------------------------------------------
>
>                 Key: SPARK-13268
>                 URL: https://issues.apache.org/jira/browse/SPARK-13268
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Ilya Ganelin
>
> There is an issue with how timestamps are displayed and converted to Strings in Spark SQL. The documentation states that the timestamp should be created in the GMT time zone; however, if we do so, the output actually contains a -8 hour offset:
> {code}
> import java.sql.Timestamp
> import java.time.ZonedDateTime
>
> // REPL running with a UTC-08:00 default time zone
> new Timestamp(ZonedDateTime.parse("2015-01-01T00:00:00Z[GMT]").toInstant.toEpochMilli)
> res144: java.sql.Timestamp = 2014-12-31 16:00:00.0
> new Timestamp(ZonedDateTime.parse("2015-01-01T00:00:00Z[GMT-08:00]").toInstant.toEpochMilli)
> res145: java.sql.Timestamp = 2015-01-01 00:00:00.0
> {code}
> This result is confusing and unintuitive, and it causes problems when DataFrames containing timestamps are converted to RDDs that are then saved as text: every date in the dataset is effectively shifted by one day.
> The suggested fix is to update the timestamp's toString representation to either a) include the time zone or b) correctly display in GMT (see the sketch below).
> This change may well introduce substantial and insidious bugs, so I'm not sure how best to resolve this.
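
As a point of reference only (not part of the issue text above), a sketch of what each suggested rendering could look like using plain java.time, side-stepping Timestamp.toString entirely. The formatter pattern is illustrative and not what Spark or the JDK actually uses.

{code}
import java.sql.Timestamp
import java.time.{Instant, ZoneOffset}
import java.time.format.DateTimeFormatter

object GmtRenderingSketch {
  def main(args: Array[String]): Unit = {
    val ts = new Timestamp(Instant.parse("2015-01-01T00:00:00Z").toEpochMilli)

    // Option b) render the stored instant in GMT regardless of the JVM default zone.
    val inGmt = DateTimeFormatter
      .ofPattern("yyyy-MM-dd HH:mm:ss.S")
      .withZone(ZoneOffset.UTC)
      .format(ts.toInstant)
    println(inGmt)                                 // 2015-01-01 00:00:00.0

    // Option a) include the offset so the string is unambiguous.
    println(ts.toInstant.atOffset(ZoneOffset.UTC)) // 2015-01-01T00:00Z
  }
}
{code}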



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org