Posted to issues@spark.apache.org by "Bill Schneider (Jira)" <ji...@apache.org> on 2020/09/03 17:24:00 UTC

[jira] [Updated] (SPARK-28955) Support for LocalDateTime semantics

     [ https://issues.apache.org/jira/browse/SPARK-28955?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Bill Schneider updated SPARK-28955:
-----------------------------------
    Issue Type: New Feature  (was: Wish)

Changing to New Feature; this is a recurring issue when different Spark jobs run in different time zones and we want the wall-clock time to remain fixed regardless of time zone (i.e., local-time semantics).
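
As a rough illustration of the mismatch (editor's sketch, not from the ticket; the app and column names are made up, and it assumes Spark 2.2+ where spark.sql.session.timeZone controls timestamp parsing): the same zone-less string resolves to two different instants depending on the session time zone.

{code:scala}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder().master("local[*]").appName("tz-demo").getOrCreate()
import spark.implicits._

// A zone-less wall-clock string, like the CSV values described in the ticket.
val raw = Seq("2019-01-01 17:00:00").toDF("ts_string")

def epochSecondsUnder(zone: String): Long = {
  // to_timestamp interprets the zone-less string in the current session time zone.
  spark.conf.set("spark.sql.session.timeZone", zone)
  raw.select(unix_timestamp(to_timestamp($"ts_string")).as("epoch")).first().getLong(0)
}

val utc = epochSecondsUnder("UTC")                  // 17:00 UTC
val la  = epochSecondsUnder("America/Los_Angeles")  // 17:00 PST = 01:00 UTC the next day

// The two epoch values differ by the zone offset (8 hours in January),
// i.e. the "same" wall-clock time is stored as two different instants.
println(s"utc=$utc la=$la diff=${la - utc}")
{code}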

> Support for LocalDateTime semantics
> -----------------------------------
>
>                 Key: SPARK-28955
>                 URL: https://issues.apache.org/jira/browse/SPARK-28955
>             Project: Spark
>          Issue Type: New Feature
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: Bill Schneider
>            Priority: Major
>
> It would be great if Spark supported local times in DataFrames, rather than only instants. 
> The specific use case I have in mind is something like
>  * parse "2019-01-01 17:00" (no timezone) from CSV -> LocalDateTime in the DataFrame
>  * save to Parquet: the LocalDateTime is stored with the same integer value as 2019-01-01 17:00 UTC, but with isAdjustedToUTC=false.  (Currently Spark saves either INT96 or TIMESTAMP_MILLIS/TIMESTAMP_MICROS, which have isAdjustedToUTC=true.)
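
For reference (editor's sketch, not part of the original ticket): the Parquet schema the second bullet asks for can be written with the parquet-mr schema builder as an INT64 column annotated TIMESTAMP with isAdjustedToUTC = false, assuming parquet-mr 1.11+ where LogicalTypeAnnotation is available; the column name "local_ts" is hypothetical.

{code:scala}
import org.apache.parquet.schema.{LogicalTypeAnnotation, MessageType, Types}
import org.apache.parquet.schema.LogicalTypeAnnotation.TimeUnit
import org.apache.parquet.schema.PrimitiveType.PrimitiveTypeName

// An INT64 timestamp column annotated as "local" (isAdjustedToUTC = false):
// the stored integer encodes the wall-clock value, not a UTC instant.
val schema: MessageType = Types.buildMessage()
  .required(PrimitiveTypeName.INT64)
    .as(LogicalTypeAnnotation.timestampType(/* isAdjustedToUTC = */ false, TimeUnit.MICROS))
    .named("local_ts")
  .named("spark_schema")

// Prints something like:
//   message spark_schema {
//     required int64 local_ts (TIMESTAMP(MICROS,false));
//   }
println(schema)
{code}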



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org