Posted to issues@spark.apache.org by "Renkai Ge (JIRA)" <ji...@apache.org> on 2019/02/13 11:24:00 UTC
[jira] [Commented] (SPARK-24778) DateTimeUtils.getTimeZone method
returns GMT time if timezone cannot be parsed
[ https://issues.apache.org/jira/browse/SPARK-24778?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16767068#comment-16767068 ]
Renkai Ge commented on SPARK-24778:
-----------------------------------
Since Spark 2.3 dropped support for Java versions before Java 8, we can use the classes in {{java.time.*}} instead of the relevant ones in {{java.util.*}}.
I will work on this issue, since I encountered the same problem in my application.
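A minimal sketch of the difference between the two APIs (class and string IDs are standard JDK behavior; the demo class name is my own): {{java.util.TimeZone.getTimeZone}} silently falls back to GMT for an ID it cannot parse, while {{java.time.ZoneId.of}} either parses the offset or throws.

```java
import java.time.DateTimeException;
import java.time.ZoneId;
import java.util.TimeZone;

public class ZoneParsingDemo {
    public static void main(String[] args) {
        // java.util.TimeZone silently falls back to GMT for IDs it cannot parse:
        // "+05:00" is not a valid TimeZone ID (custom IDs must start with "GMT").
        System.out.println(TimeZone.getTimeZone("+05:00").getID());   // GMT

        // java.time.ZoneId parses the same offset string as a ZoneOffset...
        System.out.println(ZoneId.of("+05:00"));                      // +05:00

        // ...and throws instead of guessing when the ID is truly unknown.
        try {
            ZoneId.of("Invalid/Zone");
        } catch (DateTimeException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```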
> DateTimeUtils.getTimeZone method returns GMT time if timezone cannot be parsed
> ------------------------------------------------------------------------------
>
> Key: SPARK-24778
> URL: https://issues.apache.org/jira/browse/SPARK-24778
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.1
> Reporter: Vinitha Reddy Gankidi
> Priority: Major
>
> {{DateTimeUtils.getTimeZone}} calls Java's {{TimeZone.getTimeZone}} method, which defaults to GMT if the timezone cannot be parsed. This can be misleading for users, and it's better to return NULL instead of returning an incorrect value.
> To reproduce: {{from_utc_timestamp}} is one of the functions that calls {{DateTimeUtils.getTimeZone}}. The session timezone is GMT for the following queries.
> {code:java}
> SELECT from_utc_timestamp('2018-07-10 12:00:00', 'GMT+05:00') -> 2018-07-10 17:00:00
> SELECT from_utc_timestamp('2018-07-10 12:00:00', '+05:00') -> 2018-07-10 12:00:00 (Defaults to GMT as the timezone is not recognized){code}
> We could fix it by using the workaround mentioned here: [https://bugs.openjdk.java.net/browse/JDK-4412864].
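In the spirit of the workaround discussed in the linked JDK bug, one way to detect the silent GMT fallback is to compare the ID of the returned zone against the requested ID. This is a hedged sketch, not Spark's actual fix; the helper name {{getTimeZoneOrNull}} is hypothetical.

```java
import java.util.TimeZone;

public class SafeTimeZone {
    // Hypothetical helper: returns null instead of GMT when the ID cannot
    // be parsed. TimeZone.getTimeZone falls back to GMT for unknown IDs,
    // so a GMT result for a non-GMT request signals a parse failure.
    static TimeZone getTimeZoneOrNull(String id) {
        TimeZone tz = TimeZone.getTimeZone(id);
        if ("GMT".equals(tz.getID()) && !"GMT".equals(id)) {
            return null;
        }
        return tz;
    }

    public static void main(String[] args) {
        // A valid custom ID round-trips unchanged.
        System.out.println(getTimeZoneOrNull("GMT+05:00").getID());
        // An unparseable ID now yields null instead of a bogus GMT zone.
        System.out.println(getTimeZoneOrNull("+05:00"));
    }
}
```

With this guard, callers such as {{from_utc_timestamp}} could propagate NULL rather than silently computing against GMT.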
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org