Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2020/07/10 18:08:00 UTC

[jira] [Assigned] (SPARK-32133) Forbid time field steps for date start/end in Sequence

     [ https://issues.apache.org/jira/browse/SPARK-32133?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun reassigned SPARK-32133:
-------------------------------------

    Assignee: JinxinTang

> Forbid time field steps for date start/end in Sequence
> ------------------------------------------------------
>
>                 Key: SPARK-32133
>                 URL: https://issues.apache.org/jira/browse/SPARK-32133
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: JinxinTang
>            Assignee: JinxinTang
>            Priority: Major
>             Fix For: 3.1.0
>
>
> *Sequence time field steps for date start/end look strange in Spark, as follows:*
> scala> sql("select explode(sequence(cast('2011-03-01' as date), cast('2011-03-02' as date), interval 1 hour))").head(3)
> res0: Array[org.apache.spark.sql.Row] = Array([2011-03-01], [2011-03-01], [2011-03-01]) *<- strange result*
> scala> sql("select explode(sequence(cast('2011-03-01' as date), cast('2011-03-02' as date), interval 1 day))").head(3)
> res1: Array[org.apache.spark.sql.Row] = Array([2011-03-01], [2011-03-02])
> *While this behavior in Presto makes sense:*
> presto> select sequence(date('2011-03-01'),date('2011-03-02'),interval '1' hour);
>  Query 20200624_122744_00002_pehix failed: sequence step must be a day interval if start and end values are dates
>  presto> select sequence(date('2011-03-01'),date('2011-03-02'),interval '1' day);
>  _col0
>  [2011-03-01, 2011-03-02]
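
To make the proposed restriction concrete, here is a minimal sketch (not the actual Spark patch) of the kind of step validation the ticket asks for. The helper name requireDateCompatibleStep is hypothetical; the field layout (months, days, microseconds) follows Spark 3.0's org.apache.spark.unsafe.types.CalendarInterval.

import org.apache.spark.unsafe.types.CalendarInterval

// Hypothetical validation helper: a date-typed sequence should only advance
// by whole months and/or days. Any sub-day (microsecond) component would be
// silently truncated, which is what produces the repeated [2011-03-01] rows above.
def requireDateCompatibleStep(step: CalendarInterval): Unit = {
  require(step.microseconds == 0,
    "sequence step must be an interval of day granularity " +
      "if start and end values are dates")
}

// Example: `interval 1 hour` is (0 months, 0 days, 3600000000 microseconds)
// and would be rejected, matching Presto's behaviour shown above.
requireDateCompatibleStep(new CalendarInterval(0, 0, 3600L * 1000 * 1000)) // throws IllegalArgumentException

A check along these lines, applied when the sequence start and end are DateType, would turn the silently-truncating hour-step case into an explicit error rather than the repeated-row result shown in res0.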



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org