Posted to issues@spark.apache.org by "Max Gekk (Jira)" <ji...@apache.org> on 2021/07/20 06:57:00 UTC

[jira] [Commented] (SPARK-36222) Step by days in the Sequence expression for dates

    [ https://issues.apache.org/jira/browse/SPARK-36222?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17383818#comment-17383818 ] 

Max Gekk commented on SPARK-36222:
----------------------------------

[~beliefer] Please leave a comment if you would like to work on this.

> Step by days in the Sequence expression for dates
> -------------------------------------------------
>
>                 Key: SPARK-36222
>                 URL: https://issues.apache.org/jira/browse/SPARK-36222
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: Max Gekk
>            Priority: Major
>
> Allow generating a sequence of dates with a day step within a range of dates. For instance:
> {code:sql}
> spark-sql> select sequence(date'2021-07-01', date'2021-07-10', interval '3' day);
> Error in query: cannot resolve 'sequence(DATE '2021-07-01', DATE '2021-07-10', INTERVAL '3' DAY)' due to data type mismatch:
> sequence uses the wrong parameter type. The parameter type must conform to:
> 1. The start and stop expressions must resolve to the same type.
> 2. If start and stop expressions resolve to the 'date' or 'timestamp' type
> then the step expression must resolve to the 'interval' or
> 'interval year to month' or 'interval day to second' type,
> otherwise to the same type as the start and stop expressions.
>          ; line 1 pos 7;
> 'Project [unresolvedalias(sequence(2021-07-01, 2021-07-10, Some(INTERVAL '3' DAY), Some(Europe/Moscow)), None)]
> +- OneRowRelation
> {code}
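> For reference, a sketch of the behavior this sub-task asks for (hypothetical output, assuming the day-step sequence is inclusive of the stop date when the step lands exactly on it, as with the existing timestamp sequences):
> {code:sql}
> -- expected once a day-granularity interval step is accepted for date arguments
> spark-sql> select sequence(date'2021-07-01', date'2021-07-10', interval '3' day);
> [2021-07-01,2021-07-04,2021-07-07,2021-07-10]
> {code}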



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org