Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2018/06/27 03:22:00 UTC
[jira] [Commented] (SPARK-23927) High-order function: sequence
[ https://issues.apache.org/jira/browse/SPARK-23927?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16524519#comment-16524519 ]
Apache Spark commented on SPARK-23927:
--------------------------------------
User 'ueshin' has created a pull request for this issue:
https://github.com/apache/spark/pull/21646
> High-order function: sequence
> -----------------------------
>
> Key: SPARK-23927
> URL: https://issues.apache.org/jira/browse/SPARK-23927
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 2.3.0
> Reporter: Xiao Li
> Assignee: Alex Vayda
> Priority: Major
> Fix For: 2.4.0
>
>
> Ref: https://prestodb.io/docs/current/functions/array.html
> * sequence(start, stop) → array<bigint>
> Generate a sequence of integers from start to stop, incrementing by 1 if start is less than or equal to stop, otherwise -1.
> * sequence(start, stop, step) → array<bigint>
> Generate a sequence of integers from start to stop, incrementing by step.
> * sequence(start, stop) → array<date>
> Generate a sequence of dates from start date to stop date, incrementing by 1 day if start date is less than or equal to stop date, otherwise -1 day.
> * sequence(start, stop, step) → array<date>
> Generate a sequence of dates from start to stop, incrementing by step. The type of step can be either INTERVAL DAY TO SECOND or INTERVAL YEAR TO MONTH.
> * sequence(start, stop, step) → array<timestamp>
> Generate a sequence of timestamps from start to stop, incrementing by step. The type of step can be either INTERVAL DAY TO SECOND or INTERVAL YEAR TO MONTH.
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)