Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2019/08/30 04:09:00 UTC

[jira] [Resolved] (SPARK-28918) from_utc_timestamp function is mistakenly considering DST for Brazil in 2019

     [ https://issues.apache.org/jira/browse/SPARK-28918?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-28918.
----------------------------------
    Resolution: Not A Problem

> from_utc_timestamp function is mistakenly considering DST for Brazil in 2019
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-28918
>                 URL: https://issues.apache.org/jira/browse/SPARK-28918
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.3
>         Environment: I'm using Spark through Databricks
>            Reporter: Luiz Hissashi da Rocha
>            Priority: Minor
>
> I realized that the *from_utc_timestamp* function assumes Brazil will observe DST in 2019, but, unlike previous years, it will not. Because of that, when I run the function below, instead of getting "2019-11-14T23:18:01" (São Paulo is UTC-3h), I still wrongly get "2019-11-15T00:18:01" (as if it were UTC-2h due to DST).
> {code:java}
> from_utc_timestamp("2019-11-15T02:18:01.000+0000", 'America/Sao_Paulo')
> {code}
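
For context on the "Not A Problem" resolution (this note is not part of the original report): in Spark 2.4, from_utc_timestamp resolves daylight-saving rules through the JVM's timezone database (tzdata), so the result depends on how current the cluster JRE's tzdata is; only an up-to-date tzdata knows that Brazil scrapped DST from 2019 onward. Below is a minimal, hypothetical sketch, assuming a local Spark 2.4 Scala session (the app name and column names are made up), that prints the JVM's tzdata version for America/Sao_Paulo and reproduces the conversion quoted above:

{code:java}
// Hypothetical reproduction, not from the original report. from_utc_timestamp
// applies the zone rules shipped with the JVM's tzdata, so the answer depends
// on how current that database is.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.from_utc_timestamp

val spark = SparkSession.builder().master("local[*]").appName("dst-check").getOrCreate()
import spark.implicits._

// Which tzdata version the JVM carries for America/Sao_Paulo (e.g. "2018e" vs "2019c")
println(java.time.zone.ZoneRulesProvider.getVersions("America/Sao_Paulo").lastKey)

Seq("2019-11-15 02:18:01")                // same UTC instant as in the report
  .toDF("utc_ts")
  .select(from_utc_timestamp($"utc_ts", "America/Sao_Paulo").as("sao_paulo_ts"))
  .show(false)
// up-to-date tzdata (Brazil's DST abolished): 2019-11-14 23:18:01
// stale tzdata (old DST rules still applied): 2019-11-15 00:18:01
{code}

If the stale result shows up, updating the JRE (or its bundled tzdata) on the cluster nodes is the usual remedy rather than a change in Spark itself, which is presumably why this was resolved as Not A Problem.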



--
This message was sent by Atlassian Jira
(v8.3.2#803003)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org