Posted to issues@spark.apache.org by "Maxim Gekk (JIRA)" <ji...@apache.org> on 2019/08/11 13:52:00 UTC
[jira] [Updated] (SPARK-28687) Support `epoch`, `isoyear`, `milliseconds` and `microseconds` at `extract()`
[ https://issues.apache.org/jira/browse/SPARK-28687?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Maxim Gekk updated SPARK-28687:
-------------------------------
Description:
Currently, we support these fields for EXTRACT: CENTURY, MILLENNIUM, DECADE, YEAR, QUARTER, MONTH, WEEK, DAY, DAYOFWEEK, HOUR, MINUTE, SECOND, DOW, ISODOW, DOY.
We also need to support: EPOCH, MICROSECONDS, MILLISECONDS, TIMEZONE, TIMEZONE_M, TIMEZONE_H, ISOYEAR.
https://www.postgresql.org/docs/11/functions-datetime.html#FUNCTIONS-DATETIME-EXTRACT
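The PostgreSQL semantics of the requested fields (per the documentation linked above) can be sketched in plain Python. This is only an illustration of the documented behaviour, not Spark's implementation; the timestamp value and variable names are made up for the example:

```python
from datetime import datetime, timezone

# An arbitrary illustrative timestamp (not from the issue).
ts = datetime(2019, 8, 11, 13, 52, 0, 123456, tzinfo=timezone.utc)

# EPOCH: seconds since 1970-01-01 00:00:00 UTC, including the fractional part.
epoch = ts.timestamp()

# ISOYEAR: the ISO 8601 week-numbering year, which can differ from YEAR
# near January 1st (e.g. 2021-01-01 falls in ISO year 2020).
isoyear = ts.isocalendar()[0]

# MILLISECONDS / MICROSECONDS: per PostgreSQL, the seconds field including
# fractional seconds, scaled by 1_000 and 1_000_000 respectively.
milliseconds = (ts.second + ts.microsecond / 1_000_000) * 1_000
microseconds = ts.second * 1_000_000 + ts.microsecond

print(epoch, isoyear, milliseconds, microseconds)
```

Note that MILLISECONDS and MICROSECONDS scale only the seconds field, not the whole timestamp; that distinction is what makes them separate extract fields rather than derivable from EPOCH.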
was:
Currently, we support these fields for EXTRACT: YEAR, QUARTER, MONTH, WEEK, DAY, DAYOFWEEK, HOUR, MINUTE, SECOND.
We also need to support: EPOCH, CENTURY, MILLENNIUM, DECADE, MICROSECONDS, MILLISECONDS, DOW, ISODOW, DOY, TIMEZONE, TIMEZONE_M, TIMEZONE_H, JULIAN, ISOYEAR.
https://www.postgresql.org/docs/11/functions-datetime.html#FUNCTIONS-DATETIME-EXTRACT
> Support `epoch`, `isoyear`, `milliseconds` and `microseconds` at `extract()`
> ----------------------------------------------------------------------------
>
> Key: SPARK-28687
> URL: https://issues.apache.org/jira/browse/SPARK-28687
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: Maxim Gekk
> Assignee: Maxim Gekk
> Priority: Major
> Fix For: 3.0.0
>
>
> Currently, we support these fields for EXTRACT: CENTURY, MILLENNIUM, DECADE, YEAR, QUARTER, MONTH, WEEK, DAY, DAYOFWEEK, HOUR, MINUTE, SECOND, DOW, ISODOW, DOY.
> We also need to support: EPOCH, MICROSECONDS, MILLISECONDS, TIMEZONE, TIMEZONE_M, TIMEZONE_H, ISOYEAR.
> https://www.postgresql.org/docs/11/functions-datetime.html#FUNCTIONS-DATETIME-EXTRACT
--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org