Posted to issues@spark.apache.org by "JinxinTang (Jira)" <ji...@apache.org> on 2020/06/29 23:13:00 UTC

[jira] [Created] (SPARK-32133) Forbid time field steps for date start/end

JinxinTang created SPARK-32133:
----------------------------------

             Summary: Forbid time field steps for date start/end
                 Key: SPARK-32133
                 URL: https://issues.apache.org/jira/browse/SPARK-32133
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 3.0.0
            Reporter: JinxinTang
             Fix For: 3.1.0


*In Spark, sequence accepts a time-field (sub-day) step with date start/end values, which produces strange results:*

scala> sql("select explode(sequence(cast('2011-03-01' as date), cast('2011-05-01' as date), interval 1 second))").head(3)
res26: Array[org.apache.spark.sql.Row] = Array([2011-03-01], [2011-03-01], [2011-03-01])

scala> sql("select explode(sequence(cast('2011-03-01' as date), cast('2011-05-01' as date), interval 1 minute))").head(3)
res27: Array[org.apache.spark.sql.Row] = Array([2011-03-01], [2011-03-01], [2011-03-01])

scala> sql("select explode(sequence(cast('2011-03-01' as date), cast('2011-05-01' as date), interval 1 hour))").head(3)
res28: Array[org.apache.spark.sql.Row] = Array([2011-03-01], [2011-03-01], [2011-03-01])
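
For comparison, a day-granularity step produces the expected one-row-per-day sequence (illustrative query, not from the original report; the expected first rows would be 2011-03-01, 2011-03-02, 2011-03-03):

scala> sql("select explode(sequence(cast('2011-03-01' as date), cast('2011-05-01' as date), interval 1 day))").head(3)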

*By contrast, Presto rejects such steps, which makes more sense:*

presto> select sequence(date('2011-03-01'),date('2011-03-02'),interval '1' hour);
Query 20200624_122744_00002_pehix failed: sequence step must be a day interval if start and end values are dates
presto> select sequence(date('2011-03-01'),date('2011-03-02'),interval '1' day);
_col0
[2011-03-01, 2011-03-02]
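
A minimal standalone sketch of the kind of check this improvement proposes (not the actual Spark patch; the helper name and error message are illustrative): with date-typed start/end, a step whose time fields are set (hours/minutes/seconds, i.e. a non-zero microseconds component of CalendarInterval) would be rejected up front instead of silently producing repeated dates.

import org.apache.spark.unsafe.types.CalendarInterval

// Hypothetical helper: reject sub-day steps when start and end are dates.
def validateDateSequenceStep(step: CalendarInterval): Unit = {
  // In Spark 3.0 a CalendarInterval carries months, days and microseconds;
  // hour/minute/second fields all end up in the microseconds component.
  if (step.microseconds != 0) {
    throw new IllegalArgumentException(
      "sequence step must be a day or coarser interval if start and end values are dates")
  }
}

// interval 1 day  -> months = 0, days = 1, microseconds = 0            : accepted
validateDateSequenceStep(new CalendarInterval(0, 1, 0L))
// interval 1 hour -> months = 0, days = 0, microseconds = 3600000000L  : rejected
// validateDateSequenceStep(new CalendarInterval(0, 0, 3600000000L))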



