Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2020/02/18 13:32:00 UTC

[jira] [Assigned] (SPARK-30857) Wrong truncations of timestamps before the epoch to hours and days

     [ https://issues.apache.org/jira/browse/SPARK-30857?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan reassigned SPARK-30857:
-----------------------------------

    Assignee: Maxim Gekk

> Wrong truncations of timestamps before the epoch to hours and days
> ------------------------------------------------------------------
>
>                 Key: SPARK-30857
>                 URL: https://issues.apache.org/jira/browse/SPARK-30857
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0, 2.3.4, 2.4.5
>            Reporter: Maxim Gekk
>            Assignee: Maxim Gekk
>            Priority: Major
>              Labels: correctness
>
> Truncation of timestamps after the epoch to hours is correct:
> {code:sql}
> spark-sql> select date_trunc('HOUR', '2020-02-11 00:01:02.123'), date_trunc('HOUR', '2020-02-11 00:01:02.789');
> 2020-02-11 00:00:00	2020-02-11 00:00:00
> {code}
> but truncations of timestamps before the epoch are incorrect:
> {code:sql}
> spark-sql> select date_trunc('HOUR', '1960-02-11 00:01:02.123'), date_trunc('HOUR', '1960-02-11 00:01:02.789');
> 1960-02-11 01:00:00	1960-02-11 01:00:00
> {code}
> The result must be *1960-02-11 00:00:00 1960-02-11 00:00:00*
> The same for the DAY level:
> {code:sql}
> spark-sql> select date_trunc('DAY', '1960-02-11 00:01:02.123'), date_trunc('DAY', '1960-02-11 00:01:02.789');
> 1960-02-12 00:00:00	1960-02-12 00:00:00
> {code}
> The result must be *1960-02-11 00:00:00 1960-02-11 00:00:00*
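> A likely cause (an assumption, not confirmed in this report) is that the truncation divides the microseconds-since-the-epoch value by the unit length using plain integer division, which rounds toward zero for negative values, i.e. toward the *next* hour or day boundary for pre-epoch timestamps. The following standalone sketch reproduces the reported behavior under that assumption; the object and method names are illustrative and are not Spark internals:
> {code:scala}
> import java.time.Instant
> import java.time.temporal.ChronoUnit
>
> object TruncSketch {
>   val MicrosPerHour: Long = 60L * 60 * 1000 * 1000
>
>   // Suspected bug: '/' on a negative Long rounds toward zero, i.e. up for negatives.
>   def truncToHourWrong(micros: Long): Long =
>     micros / MicrosPerHour * MicrosPerHour
>
>   // Expected behavior: floorDiv always rounds toward negative infinity.
>   def truncToHourFloored(micros: Long): Long =
>     Math.floorDiv(micros, MicrosPerHour) * MicrosPerHour
>
>   def toMicros(i: Instant): Long = i.getEpochSecond * 1000000L + i.getNano / 1000
>   def toInstant(micros: Long): Instant = Instant.EPOCH.plus(micros, ChronoUnit.MICROS)
>
>   def main(args: Array[String]): Unit = {
>     val ts = toMicros(Instant.parse("1960-02-11T00:01:02.123Z"))
>     println(toInstant(truncToHourWrong(ts)))   // 1960-02-11T01:00:00Z -- matches the wrong result
>     println(toInstant(truncToHourFloored(ts))) // 1960-02-11T00:00:00Z -- the expected result
>   }
> }
> {code}
> The same floor-versus-truncate distinction applies at the DAY level, where the unit length is 24 * MicrosPerHour.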


