Posted to issues@spark.apache.org by "Maxim Gekk (Jira)" <ji...@apache.org> on 2020/03/24 20:15:00 UTC

[jira] [Created] (SPARK-31237) Replace 3-letter time zones by zone offsets

Maxim Gekk created SPARK-31237:
----------------------------------

             Summary: Replace 3-letter time zones by zone offsets
                 Key: SPARK-31237
                 URL: https://issues.apache.org/jira/browse/SPARK-31237
             Project: Spark
          Issue Type: Test
          Components: SQL
    Affects Versions: 3.0.0
            Reporter: Maxim Gekk


3-letter time zones are ambiguous and have already been deprecated in the JDK, see [https://docs.oracle.com/javase/8/docs/api/java/util/TimeZone.html] . Also, some short names are mapped to region-based zone IDs and don't conform to their actual definitions. For example, the PST short name is mapped to America/Los_Angeles, and it resolves to different zone offsets in the Java 7 and Java 8 APIs. With the Java 7 API:
{code:scala}
scala> import java.util.TimeZone
scala> import java.sql.Timestamp

scala> TimeZone.getTimeZone("PST").getOffset(Timestamp.valueOf("2016-11-05 23:00:00").getTime)/3600000.0
res11: Double = -7.0
scala> TimeZone.getTimeZone("PST").getOffset(Timestamp.valueOf("2016-11-06 00:00:00").getTime)/3600000.0
res12: Double = -7.0
scala> TimeZone.getTimeZone("PST").getOffset(Timestamp.valueOf("2016-11-06 01:00:00").getTime)/3600000.0
res13: Double = -8.0
scala> TimeZone.getTimeZone("PST").getOffset(Timestamp.valueOf("2016-11-06 02:00:00").getTime)/3600000.0
res14: Double = -8.0
scala> TimeZone.getTimeZone("PST").getOffset(Timestamp.valueOf("2016-11-06 03:00:00").getTime)/3600000.0
res15: Double = -8.0
{code}
and with the Java 8 API it behaves differently, see https://github.com/apache/spark/pull/27980#discussion_r396287278
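
As a rough sketch of the kind of difference involved (not the exact snippet from the linked discussion), the Java 8 java.time API only resolves "PST" through the deprecated short-ID alias map and picks the earlier offset for ambiguous local times:
{code:scala}
import java.time.{LocalDateTime, ZoneId}

// "PST" is not a valid region ID on its own in java.time; it only resolves
// through the short-ID alias map, which points to America/Los_Angeles.
val zone = ZoneId.of("PST", ZoneId.SHORT_IDS)

// 2016-11-06 01:00 is an ambiguous local time (the DST fall-back overlap).
// java.time resolves the overlap to the earlier offset, -07:00, while the
// java.util.TimeZone output above reports -08:00 for the same wall-clock time.
val offset = LocalDateTime.parse("2016-11-06T01:00:00").atZone(zone).getOffset
// offset: java.time.ZoneOffset = -07:00
{code}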

By definition, PST is a constant offset equal to UTC-08:00, see https://www.timeanddate.com/time/zones/pst

This ticket aims to replace all short time zone names with zone offsets in tests, as illustrated below.
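
A hypothetical example of the intended replacement (not an actual test from the Spark code base), using an explicit fixed offset instead of the ambiguous short name:
{code:scala}
import java.util.TimeZone
import java.time.ZoneId

// Before: ambiguous 3-letter short name, silently mapped to a region ID.
val ambiguous = TimeZone.getTimeZone("PST")          // America/Los_Angeles under the hood

// After: explicit zone offset, constant regardless of DST rules.
val fixedOffset = TimeZone.getTimeZone("GMT-08:00")  // always UTC-08:00
val fixedZoneId = ZoneId.of("-08:00")                // Java 8 equivalent
{code}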
 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org