Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2020/03/23 05:07:00 UTC

[jira] [Commented] (SPARK-31174) unix_timestamp() function returning NULL values for corner cases (daylight saving)

    [ https://issues.apache.org/jira/browse/SPARK-31174?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17064529#comment-17064529 ] 

Hyukjin Kwon commented on SPARK-31174:
--------------------------------------

This seems to be fixed in Spark's master branch. I presume it was fixed by the rework that replaced Spark's datetime APIs, which is difficult to backport. Let me resolve this as Cannot Reproduce (against the master branch) for now.
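
For context: 2020-03-08 02:00:00 does not exist as a wall-clock time in US time zones, because clocks jump from 02:00 straight to 03:00 on that night. A minimal Scala sketch of how java.time (which the datetime rework on master is built on) resolves such a gap, using America/Los_Angeles purely for illustration:

{code:java}
import java.time.{LocalDateTime, ZoneId}

val zone  = ZoneId.of("America/Los_Angeles")   // any zone with US DST rules
val local = LocalDateTime.of(2020, 3, 8, 2, 0) // a time inside the DST gap

// java.time does not fail on a nonexistent local time: atZone() shifts the
// value forward by the length of the gap, so 02:00 resolves to 03:00 PDT.
val zoned = local.atZone(zone)
println(zoned)               // 2020-03-08T03:00-07:00[America/Los_Angeles]
println(zoned.toEpochSecond) // 1583661600
{code}

The legacy parser path in 2.x rejects such nonexistent local times, which is presumably what surfaces as NULL below.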

> unix_timestamp() function returning NULL values for corner cases (daylight saving)
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-31174
>                 URL: https://issues.apache.org/jira/browse/SPARK-31174
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, SQL
>    Affects Versions: 2.3.4
>            Reporter: Piotr Skąpski
>            Priority: Minor
>
> Running the code below does not return 4 values, because one timestamp is not created correctly (a possible problem with daylight saving time):
> {code:java}
> spark.sql("""
>   SELECT from_unixtime(unix_timestamp('2020-03-08 01:00:00'), 'yyyyMMdd') t1,
>          from_unixtime(unix_timestamp('2020-03-08 02:00:00'), 'yyyyMMdd') t2,
>          from_unixtime(unix_timestamp('2020-03-08 03:00:00'), 'yyyyMMdd') t3,
>          from_unixtime(unix_timestamp('2020-03-08 04:00:00'), 'yyyyMMdd') t4
>   """).show
> +--------+----+--------+--------+
> |      t1|  t2|      t3|      t4|
> +--------+----+--------+--------+
> |20200308|null|20200308|20200308|
> +--------+----+--------+--------+{code}
> This NULL value caused us problems, as we did not expect it.
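
For anyone who has to stay on 2.x, one possible workaround (a sketch, not verified against 2.3.4) is to parse in a time zone that has no DST gaps, such as UTC:

{code:java}
// Make the session parse timestamps in UTC, which has no DST transitions.
spark.conf.set("spark.sql.session.timeZone", "UTC")

// Note: some 2.x datetime code paths still consult the JVM default zone, so
// setting -Duser.timezone=UTC on the driver and executors may also be needed.
spark.sql("SELECT from_unixtime(unix_timestamp('2020-03-08 02:00:00'), 'yyyyMMdd') t2").show
{code}

This sidesteps the gap rather than fixing the parser, so the returned timestamps are interpreted in UTC instead of the local zone.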



