Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2020/03/05 08:58:01 UTC
[jira] [Resolved] (SPARK-30668) to_timestamp failed to parse 2020-01-27T20:06:11.847-0800 using pattern "yyyy-MM-dd'T'HH:mm:ss.SSSz"
[ https://issues.apache.org/jira/browse/SPARK-30668?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan resolved SPARK-30668.
---------------------------------
Resolution: Fixed
Issue resolved by pull request 27537
[https://github.com/apache/spark/pull/27537]
> to_timestamp failed to parse 2020-01-27T20:06:11.847-0800 using pattern "yyyy-MM-dd'T'HH:mm:ss.SSSz"
> ----------------------------------------------------------------------------------------------------
>
> Key: SPARK-30668
> URL: https://issues.apache.org/jira/browse/SPARK-30668
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: Xiao Li
> Assignee: Maxim Gekk
> Priority: Blocker
> Fix For: 3.0.0
>
>
> {code:sql}
> SELECT to_timestamp("2020-01-27T20:06:11.847-0800", "yyyy-MM-dd'T'HH:mm:ss.SSSz")
> {code}
> This returns a valid value in Spark 2.4 but NULL in the latest master, likely because Spark 3.0 moved from SimpleDateFormat to java.time's DateTimeFormatter, whose pattern letter "z" matches time-zone names (e.g. "PST") but not numeric offsets such as "-0800".
> **2.4.5 RC2**
> {code}
> scala> sql("""SELECT to_timestamp("2020-01-27T20:06:11.847-0800", "yyyy-MM-dd'T'HH:mm:ss.SSSz")""").show
> +----------------------------------------------------------------------------+
> |to_timestamp('2020-01-27T20:06:11.847-0800', 'yyyy-MM-dd\'T\'HH:mm:ss.SSSz')|
> +----------------------------------------------------------------------------+
> |                                                         2020-01-27 20:06:11|
> +----------------------------------------------------------------------------+
> {code}
> **2.2.3 ~ 2.4.4** (2.0.2 ~ 2.1.3 don't have `to_timestamp`):
> {code}
> spark-sql> SELECT to_timestamp("2020-01-27T20:06:11.847-0800", "yyyy-MM-dd'T'HH:mm:ss.SSSz");
> 2020-01-27 20:06:11
> {code}
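> A minimal workaround sketch for builds that still show the regression (not from this ticket; it assumes the java.time explanation above and a spark-sql session). Option 1 rewrites the pattern with the offset letter "XX", which DateTimeFormatter accepts for values like "-0800"; option 2 keeps the original pattern and falls back to the Spark 2.x SimpleDateFormat parser via the spark.sql.legacy.timeParserPolicy config introduced in 3.0:
> {code:sql}
> -- Option 1: parse the numeric offset with the java.time pattern letter "XX"
> SELECT to_timestamp("2020-01-27T20:06:11.847-0800", "yyyy-MM-dd'T'HH:mm:ss.SSSXX");
>
> -- Option 2: restore the legacy SimpleDateFormat parser for the original pattern
> SET spark.sql.legacy.timeParserPolicy=LEGACY;
> SELECT to_timestamp("2020-01-27T20:06:11.847-0800", "yyyy-MM-dd'T'HH:mm:ss.SSSz");
> {code}
> (The displayed value depends on the session time zone; both statements should parse without returning NULL.)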