Posted to issues@spark.apache.org by "JinxinTang (Jira)" <ji...@apache.org> on 2020/06/16 03:02:00 UTC

[jira] [Comment Edited] (SPARK-31980) Spark sequence() fails if start and end of range are identical dates

    [ https://issues.apache.org/jira/browse/SPARK-31980?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17136248#comment-17136248 ] 

JinxinTang edited comment on SPARK-31980 at 6/16/20, 3:01 AM:
--------------------------------------------------------------

[~DaveDeCaprio] Maybe one of these PRs could be rebased to resolve the conflict in this code section.


was (Author: jinxintang):
[~DaveDeCaprio] Maybe one of this PRs could rebase to solve this code section conflict.

> Spark sequence() fails if start and end of range are identical dates
> --------------------------------------------------------------------
>
>                 Key: SPARK-31980
>                 URL: https://issues.apache.org/jira/browse/SPARK-31980
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.4
>         Environment: Spark 2.4.4 standalone and on AWS EMR
>            Reporter: Dave DeCaprio
>            Priority: Minor
>
>  
> The following Spark SQL query throws an exception
> {code:java}
> select sequence(cast("2011-03-01" as date), cast("2011-03-01" as date), interval 1 month)
> {code}
> The error is:
> {noformat}
> java.lang.ArrayIndexOutOfBoundsException: 1
>     at scala.runtime.ScalaRunTime$.array_update(ScalaRunTime.scala:92)
>     at org.apache.spark.sql.catalyst.expressions.Sequence$TemporalSequenceImpl.eval(collectionOperations.scala:2681)
>     at org.apache.spark.sql.catalyst.expressions.Sequence.eval(collectionOperations.scala:2514)
>     at org.apache.spark.sql.catalyst.expressions.UnaryExpression.eval(Expression.scala:389)
> {noformat}
>  
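
For readers following along, here is a minimal reproduction sketch of the reported failure, assuming a local SparkSession on the affected version (Spark 2.4.4); the query is the same one from the issue description, and the application name and local master setting are illustrative only.

{code:scala}
import org.apache.spark.sql.SparkSession

// Minimal reproduction sketch (hypothetical local setup, not taken from the report itself).
// On Spark 2.4.4 the sequence() call below reportedly throws
// ArrayIndexOutOfBoundsException because the start and end dates are identical.
val spark = SparkSession.builder()
  .appName("SPARK-31980-repro")
  .master("local[*]")
  .getOrCreate()

spark.sql(
  """select sequence(cast("2011-03-01" as date),
    |                cast("2011-03-01" as date),
    |                interval 1 month)""".stripMargin
).show(truncate = false)
{code}

Once the fix lands, the same query would be expected to return a one-element array containing 2011-03-01 instead of failing.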



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org