Posted to issues@spark.apache.org by "Peter Toth (JIRA)" <ji...@apache.org> on 2019/05/30 13:59:00 UTC

[jira] [Created] (SPARK-27882) Support SQL:2016 compatible datetime patterns

Peter Toth created SPARK-27882:
----------------------------------

             Summary: Support SQL:2016 compatible datetime patterns
                 Key: SPARK-27882
                 URL: https://issues.apache.org/jira/browse/SPARK-27882
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 3.0.0
            Reporter: Peter Toth


Date and time related functions in Spark use [DateTimeFormatter|https://docs.oracle.com/javase/8/docs/api/java/time/format/DateTimeFormatter.html] to parse and format dates. Unfortunately, it is not entirely compatible with the SQL:2016 standard.
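To illustrate the incompatibility (a minimal sketch, not from the ticket): SQL:2016/Oracle-style format models treat pattern letters case-insensitively (e.g. MM and mm both mean month), while DateTimeFormatter's letters are case-sensitive, so the same pattern string can produce a different result:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class PatternMismatch {
    public static void main(String[] args) {
        LocalDateTime t = LocalDateTime.of(2019, 5, 30, 13, 59);

        // Java's pattern letters: 'MM' = month-of-year, 'mm' = minute-of-hour.
        DateTimeFormatter monthPattern = DateTimeFormatter.ofPattern("yyyy-MM-dd");
        System.out.println(monthPattern.format(t)); // 2019-05-30

        // In an Oracle-style SQL pattern, "yyyy-mm-dd" would still mean
        // year-month-day, but DateTimeFormatter reads 'mm' as minute-of-hour:
        DateTimeFormatter minutePattern = DateTimeFormatter.ofPattern("yyyy-mm-dd");
        System.out.println(minutePattern.format(t)); // 2019-59-30
    }
}
```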

There are initiatives in Impala (IMPALA-4018) and Hive (HIVE-21575) to support SQL:2016 patterns, as both systems currently lack this feature.
The [Impala design document|https://docs.google.com/document/d/1V7k6-lrPGW7_uhqM-FhKl3QsxwCRy69v2KIxPsGjc1k/edit] and [compatibility matrix|https://docs.google.com/spreadsheets/d/1lD4EcVaTp-qtm0JYPTPfhqYrwwNl0qJHHYP8dICS9EY/edit#gid=0] created by [~gaborkaszab] cover how compliance with the standard can be achieved and how other databases support it. I believe these documents are very useful for the Spark community as well.
Since Hive is also JVM based, I see an option to share some of the code that does the parsing/formatting according to the standard.

(Please note that format patterns are just one part of these tickets; the other part, the `CAST (... FORMAT <pattern>)` syntax, which is also part of the SQL:2016 standard, is covered in SPARK-27881.)
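For context, the FORMAT clause referenced above allows attaching a datetime pattern directly to a cast. A hypothetical example of what such a query could look like (the exact syntax Spark would adopt is the subject of SPARK-27881):

```sql
-- Sketch of a SQL:2016-style cast with an explicit datetime format:
SELECT CAST('30-05-2019' AS DATE FORMAT 'DD-MM-YYYY');
```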





--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org