Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2019/03/26 22:31:00 UTC
[jira] [Resolved] (SPARK-27242) Avoid using default time zone in formatting TIMESTAMP/DATE literals
[ https://issues.apache.org/jira/browse/SPARK-27242?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan resolved SPARK-27242.
---------------------------------
Resolution: Fixed
Fix Version/s: 3.0.0
Issue resolved by pull request 24181
[https://github.com/apache/spark/pull/24181]
> Avoid using default time zone in formatting TIMESTAMP/DATE literals
> -------------------------------------------------------------------
>
> Key: SPARK-27242
> URL: https://issues.apache.org/jira/browse/SPARK-27242
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 2.4.0
> Reporter: Maxim Gekk
> Assignee: Maxim Gekk
> Priority: Minor
> Fix For: 3.0.0
>
>
> Spark calls the toString() methods of java.sql.Timestamp/java.sql.Date when formatting TIMESTAMP/DATE literals in Literal.sql: https://github.com/apache/spark/blob/0f4f8160e6d01d2e263adcf39d53bd0a03fc1b73/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/literals.scala#L373-L374 . This is inconsistent with how TIMESTAMP/DATE literals are parsed in AstBuilder: https://github.com/apache/spark/blob/a529be2930b1d69015f1ac8f85e590f197cf53cf/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/AstBuilder.scala#L1594-L1597 where *spark.sql.session.timeZone* is used for parsing TIMESTAMP literals, and DATE literals are parsed independently of the time zone (effectively in UTC). The ticket aims to make parsing and formatting of date/timestamp literals consistent, and to use the SQL config for TIMESTAMP literals.
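> As a minimal standalone sketch (not Spark code, just plain JDK classes) of why toString()-based formatting is problematic: the same instant renders differently once the JVM default time zone changes, whereas parsing in AstBuilder is driven by the explicit *spark.sql.session.timeZone* setting.
> {code:scala}
> import java.sql.Timestamp
> import java.util.TimeZone
>
> object TimestampToStringDemo {
>   def main(args: Array[String]): Unit = {
>     val epochMillis = 0L // the instant 1970-01-01T00:00:00Z
>
>     // java.sql.Timestamp.toString() formats in the JVM default time zone.
>     TimeZone.setDefault(TimeZone.getTimeZone("UTC"))
>     println(new Timestamp(epochMillis)) // prints 1970-01-01 00:00:00.0
>
>     // Switching the default zone changes how the same instant is rendered.
>     TimeZone.setDefault(TimeZone.getTimeZone("America/Los_Angeles"))
>     println(new Timestamp(epochMillis)) // prints 1969-12-31 16:00:00.0
>   }
> }
> {code}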
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)