Posted to issues@spark.apache.org by "Kousuke Saruta (Jira)" <ji...@apache.org> on 2021/09/11 13:12:00 UTC

[jira] [Updated] (SPARK-36723) day-time interval types should respect daylight saving time correctly

     [ https://issues.apache.org/jira/browse/SPARK-36723?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kousuke Saruta updated SPARK-36723:
-----------------------------------
    Description: 
In the current master, day-time interval types handle 24 hours as 1 day regardless of the time zone.
So, some operations on day-time interval data don't respect daylight saving time. For example, DST begins at 2:00 AM on 2019-03-10 in America/Los_Angeles, so that day has only 23 hours, and adding 24 physical hours to 2019-03-10 01:00:00 should land on 2019-03-11 02:00:00.
{code}
spark-sql> SET spark.sql.legacy.interval.enabled=false;
spark-sql> SET spark.sql.session.timeZone=America/Los_Angeles;

spark-sql> SELECT timestamp'2019-03-10 01:00:00' + INTERVAL '1' DAY;
2019-03-11 01:00:00 -- OK. Expected result.

spark-sql> SELECT timestamp'2019-03-10 01:00:00' + INTERVAL '24' HOUR;
2019-03-11 01:00:00 -- Not OK. 2019-03-11 02:00:00 is expected.
{code}

On the other hand, non-ANSI interval types properly handle daylight saving time.
{code}
spark-sql> SET spark.sql.legacy.interval.enabled=true;
spark-sql> SET spark.sql.session.timeZone=America/Los_Angeles;

spark-sql> SELECT timestamp'2019-03-10 01:00:00' + INTERVAL '1' DAY;
2019-03-11 01:00:00

spark-sql> SELECT timestamp'2019-03-10 01:00:00' + INTERVAL '24' HOUR;
2019-03-11 02:00:00
{code}
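
For reference, here is a minimal sketch using plain java.time (not Spark code, just an illustration) of the DST-aware semantics the HOUR case is expected to follow in America/Los_Angeles:
{code:scala}
// Illustration only: ZonedDateTime.plusDays() operates on the local time-line,
// while ZonedDateTime.plusHours() operates on the instant time-line, which is
// the DST-aware behavior expected for the HOUR case above.
import java.time.{LocalDateTime, ZoneId}

object DstSketch {
  def main(args: Array[String]): Unit = {
    val zone  = ZoneId.of("America/Los_Angeles")
    val start = LocalDateTime.parse("2019-03-10T01:00:00").atZone(zone)

    // Local time-line: the wall-clock time is preserved, like INTERVAL '1' DAY.
    println(start.plusDays(1).toLocalDateTime)   // 2019-03-11T01:00

    // Instant time-line: 24 physical hours later, crossing the spring-forward
    // gap at 2019-03-10 02:00, like INTERVAL '24' HOUR should.
    println(start.plusHours(24).toLocalDateTime) // 2019-03-11T02:00
  }
}
{code}
Adding days on the local time-line keeps the wall-clock time, while adding hours on the instant time-line crosses the 23-hour day; the latter is the behavior the day-time interval HOUR field is expected to reproduce.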

  was:
In the current master, day-time interval types handle 24 hours as 1 day regardless of the time zone.
So, some operations on day-time interval data don't respect daylight saving time.
{code}
spark-sql> SET spark.sql.legacy.interval.enabled=false;
spark-sql> spark.sql.session.timeZone America/Los_Angeles

spark-sql> SELECT timestamp'2019-03-10 01:00:00' + INTERVAL '1' DAY;
2019-03-11 01:00:00 -- OK. Expected result.

spark-sql> SELECT timestamp'2019-03-10 01:00:00' + INTERVAL '24' HOUR;
2019-03-11 01:00:00 -- Not OK. 2019-03-11 02:00:00 is expected.
{code}

On the other hand, non-ANSI interval types properly handle daylight saving time.
{code}
spark-sql> SET spark.sql.legacy.interval.enabled=true;
spark-sql> spark.sql.session.timeZone America/Los_Angeles

spark-sql> SELECT timestamp'2019-03-10 01:00:00' + INTERVAL '1' DAY;
2019-03-11 01:00:00

spark-sql> SELECT timestamp'2019-03-10 01:00:00' + INTERVAL '24' HOUR;
2019-03-11 02:00:00
{code}


> day-time interval types should respect daylight saving time correctly
> ---------------------------------------------------------------------
>
>                 Key: SPARK-36723
>                 URL: https://issues.apache.org/jira/browse/SPARK-36723
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: Kousuke Saruta
>            Priority: Major
>
> In the current master, day-time interval types handle 24 hours as 1 day regardless of the time zone.
> So, some operations on day-time interval data don't respect daylight saving time.
> {code}
> spark-sql> SET spark.sql.legacy.interval.enabled=false;
> spark-sql> SET spark.sql.session.timeZone=America/Los_Angeles;
> spark-sql> SELECT timestamp'2019-03-10 01:00:00' + INTERVAL '1' DAY;
> 2019-03-11 01:00:00 -- OK. Expected result.
> spark-sql> SELECT timestamp'2019-03-10 01:00:00' + INTERVAL '24' HOUR;
> 2019-03-11 01:00:00 -- Not OK. 2019-03-11 02:00:00 is expected.
> {code}
> On the other hand, non-ANSI interval types properly handle daylight saving time.
> {code}
> spark-sql> SET spark.sql.legacy.interval.enabled=true;
> spark-sql> SET spark.sql.session.timeZone=America/Los_Angeles;
> spark-sql> SELECT timestamp'2019-03-10 01:00:00' + INTERVAL '1' DAY;
> 2019-03-11 01:00:00
> spark-sql> SELECT timestamp'2019-03-10 01:00:00' + INTERVAL '24' HOUR;
> 2019-03-11 02:00:00
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org