Posted to issues@spark.apache.org by "Bill Chambers (JIRA)" <ji...@apache.org> on 2016/11/16 04:33:58 UTC
[jira] [Updated] (SPARK-18424) Improve Date Parsing Semantics & Functionality
[ https://issues.apache.org/jira/browse/SPARK-18424?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Bill Chambers updated SPARK-18424:
----------------------------------
Summary: Improve Date Parsing Semantics & Functionality (was: Improve Date Parsing Functionality)
> Improve Date Parsing Semantics & Functionality
> ----------------------------------------------
>
> Key: SPARK-18424
> URL: https://issues.apache.org/jira/browse/SPARK-18424
> Project: Spark
> Issue Type: Improvement
> Reporter: Bill Chambers
> Assignee: Bill Chambers
> Priority: Minor
>
> I've found it quite cumbersome to work with dates in Spark so far: it can be hard to reason about the time format and the type you're working with. For instance,
> say that I have a date in the format
> {code}
> 2017-20-12
> // year-day-month (yyyy-dd-MM)
> {code}
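As an aside, the ambiguity described above is easy to demonstrate outside of Spark. The following is a minimal, illustrative sketch (not Spark code) using Python's strptime, whose format tokens differ from Java's SimpleDateFormat but capture the same year-day-month ordering:

```python
from datetime import datetime

# "%Y-%d-%m" reads the string as year-day-month,
# so "20" is the day and "12" is the month.
d = datetime.strptime("2017-20-12", "%Y-%d-%m").date()
print(d)  # 2017-12-20
```

Without an explicit format, a parser would either reject "20" as an invalid month or silently misread the value, which is exactly the reasoning burden the issue describes.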
> In order to parse that into a Date, I have to perform several conversions.
> {code}
> to_date(
>   unix_timestamp(col("date"), dateFormat)
>     .cast("timestamp"))
>   .alias("date")
> {code}
> I propose simplifying this by overloading the existing to_date function with a variant that accepts a format for the date, and adding a to_timestamp function that likewise accepts a format,
> so that the above chain of conversions can be avoided entirely.
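To make the proposed semantics concrete, here is a hypothetical sketch in plain Python of what a format-aware to_date would do. The function name mirrors the proposal, but the implementation, the single-string signature, and the strptime-style format tokens are all assumptions for illustration only; the real function would operate on Spark columns and use Java date-format patterns:

```python
from datetime import datetime, date

def to_date(value: str, fmt: str = "%Y-%m-%d") -> date:
    # Hypothetical sketch of the proposed API: parse a string
    # directly into a date using an optional user-supplied format,
    # with no intermediate unix_timestamp/cast steps.
    return datetime.strptime(value, fmt).date()

print(to_date("2017-20-12", "%Y-%d-%m"))  # 2017-12-20
```

A to_timestamp variant would be identical except that it keeps the time component instead of truncating to a date.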
> It's also worth mentioning that many other databases support this. For instance, MySQL has the STR_TO_DATE function, and Netezza supports the to_timestamp semantics.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org