Posted to issues@spark.apache.org by "Vinitha Reddy Gankidi (JIRA)" <ji...@apache.org> on 2018/07/10 18:48:00 UTC

[jira] [Created] (SPARK-24778) DateTimeUtils.getTimeZone method returns GMT time if timezone cannot be parsed

Vinitha Reddy Gankidi created SPARK-24778:
---------------------------------------------

             Summary: DateTimeUtils.getTimeZone method returns GMT time if timezone cannot be parsed
                 Key: SPARK-24778
                 URL: https://issues.apache.org/jira/browse/SPARK-24778
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.3.1
            Reporter: Vinitha Reddy Gankidi


`DateTimeUtils.getTimeZone` calls Java's `TimeZone.getTimeZone` method, which defaults to GMT if the timezone cannot be parsed. This can be misleading for users, and it's better to return NULL instead of an incorrect value.

To reproduce: `from_utc_timestamp` is one of the functions that calls `DateTimeUtils.getTimeZone`. The session timezone is GMT for the following queries.
{code:java}
SELECT from_utc_timestamp('2018-07-10 12:00:00', 'GMT+05:00') -> 2018-07-10 17:00:00 
SELECT from_utc_timestamp('2018-07-10 12:00:00', '+05:00') -> 2018-07-10 12:00:00 (Defaults to GMT as the timezone is not recognized){code}
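The underlying JDK behavior can be reproduced directly, without Spark. This is a minimal sketch showing that `TimeZone.getTimeZone` silently falls back to GMT for an ID it cannot parse (the class name is illustrative):

```java
import java.util.TimeZone;

public class TimeZoneFallbackDemo {
    public static void main(String[] args) {
        // A recognized custom ID: Java builds a zone with a +05:00 offset.
        TimeZone ok = TimeZone.getTimeZone("GMT+05:00");
        System.out.println(ok.getID());         // GMT+05:00

        // An unrecognized ID: Java silently falls back to GMT,
        // so the +05:00 offset is lost without any error.
        TimeZone bad = TimeZone.getTimeZone("+05:00");
        System.out.println(bad.getID());        // GMT
        System.out.println(bad.getRawOffset()); // 0
    }
}
```

This silent fallback is exactly what makes the second `from_utc_timestamp` query above return the input unchanged.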
We could fix it by using the workaround mentioned here: [https://bugs.openjdk.java.net/browse/JDK-4412864]
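One possible shape for such a fix (a sketch, not Spark's actual code; the method and class names are hypothetical) is to parse the ID with `java.time.ZoneId`, which throws on unknown IDs instead of defaulting to GMT, and which also accepts bare offsets such as `+05:00`:

```java
import java.time.DateTimeException;
import java.time.ZoneId;
import java.util.TimeZone;

public class SafeTimeZone {
    // Returns null for IDs the JDK cannot parse, instead of silently
    // falling back to GMT as TimeZone.getTimeZone does.
    public static TimeZone getTimeZoneOrNull(String id) {
        try {
            // ZoneId.of throws DateTimeException for unknown region IDs,
            // and it accepts bare offsets like "+05:00".
            return TimeZone.getTimeZone(ZoneId.of(id));
        } catch (DateTimeException e) {
            return null;
        }
    }
}
```

With this approach `+05:00` would resolve to a real +05:00 zone rather than GMT, and a genuinely bogus ID would yield null so the caller can return NULL to the user.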

A side note: 
{code:java}
select cast( '2018-07-09 22:00:00+05:00' as timestamp) -> 2018-07-09 17:00:00{code}
Casting the value as a timestamp actually returns the correct value because it invokes `DateTimeUtils.stringToTimestamp`, which supports the time offsets [+/-][h]h:[m]m.
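The same offset-aware parsing can be illustrated with plain `java.time` (a sketch of the behavior, not Spark's implementation): the offset is consumed as part of the literal, and shifting to UTC reproduces the CAST result above.

```java
import java.time.OffsetDateTime;
import java.time.ZoneOffset;

public class OffsetParseDemo {
    public static void main(String[] args) {
        // The +05:00 suffix is parsed as part of the timestamp literal,
        // much like DateTimeUtils.stringToTimestamp handles it in the CAST.
        OffsetDateTime odt = OffsetDateTime.parse("2018-07-09T22:00:00+05:00");

        // Shifting to UTC gives 2018-07-09 17:00, matching the CAST result.
        System.out.println(odt.withOffsetSameInstant(ZoneOffset.UTC));
    }
}
```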



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org