Posted to issues@spark.apache.org by "Navya Krishnappa (JIRA)" <ji...@apache.org> on 2017/09/15 06:13:00 UTC
[jira] [Created] (SPARK-22020) Support session local timezone
Navya Krishnappa created SPARK-22020:
----------------------------------------
Summary: Support session local timezone
Key: SPARK-22020
URL: https://issues.apache.org/jira/browse/SPARK-22020
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 2.2.0
Reporter: Navya Krishnappa
As of Spark 2.1, Spark SQL assumes the machine time zone for datetime manipulation, which is problematic when users are not in the same time zone as the machines, or when different users operate in different time zones.
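For context, here is a minimal sketch (not part of the original report) of how the session time zone setting introduced in Spark 2.2 is intended to behave: a timestamp string carrying an explicit UTC marker, such as the SparkDate1 value from the input data below, should display unshifted once the session time zone is set to UTC.

spark.conf.set("spark.sql.session.timeZone", "UTC")
// If the setting is honored, this should print 2017-03-21 03:30:02.02
// rather than a value shifted by the machine's local offset.
spark.sql("SELECT CAST('2017-03-21T03:30:02.02Z' AS TIMESTAMP) AS ts").show(false)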
Input data:
Date,SparkDate,SparkDate1,SparkDate2
04/22/2017T03:30:02,2017-03-21T03:30:02,2017-03-21T03:30:02.02Z,2017-03-21T00:00:00Z
I have set the configuration below to put the session time zone in UTC, but Spark still applies the machine's current time zone offset even though the input value is already in UTC format.
spark.conf.set("spark.sql.session.timeZone", "UTC")
Expected: the time should remain the same as the input, since it is already in UTC.
var df1 = spark.read
  .option("delimiter", ",")
  .option("quote", "\"")
  .option("inferSchema", "true")
  .option("header", "true")
  .option("mode", "PERMISSIVE")
  .option("timestampFormat", "MM/dd/yyyy'T'HH:mm:ss.SSS")
  .option("dateFormat", "MM/dd/yyyy'T'HH:mm:ss")
  .csv("DateSpark.csv")
df1: org.apache.spark.sql.DataFrame = [Name: string, Age: int ... 5 more fields]
scala> df1.show(false);
+----+---+----+-------------------+-------------------+----------------------+-------------------+
|Name|Age|Add |Date               |SparkDate          |SparkDate1            |SparkDate2         |
+----+---+----+-------------------+-------------------+----------------------+-------------------+
|abc |21 |bvxc|04/22/2017T03:30:02|2017-03-21 03:30:02|2017-03-21 09:00:02.02|2017-03-21 05:30:00|
+----+---+----+-------------------+-------------------+----------------------+-------------------+
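A possible workaround worth trying (a sketch, not verified against this data): in Spark 2.2 the CSV source also accepts a per-read "timeZone" option, which defaults to the session time zone, so pinning it to UTC on the reader itself may sidestep the machine zone.

val df2 = spark.read
  .option("delimiter", ",")
  .option("header", "true")
  .option("inferSchema", "true")
  .option("timeZone", "UTC")  // per-source override; defaults to spark.sql.session.timeZone
  .option("timestampFormat", "MM/dd/yyyy'T'HH:mm:ss.SSS")
  .option("dateFormat", "MM/dd/yyyy'T'HH:mm:ss")
  .csv("DateSpark.csv")
df2.show(false)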
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org