Posted to issues@spark.apache.org by "Kent Yao (Jira)" <ji...@apache.org> on 2019/11/28 12:30:00 UTC

[jira] [Created] (SPARK-30070) ANSI-F053: Overlaps datetimes predicate support

Kent Yao created SPARK-30070:
--------------------------------

             Summary: ANSI-F053: Overlaps datetimes predicate support
                 Key: SPARK-30070
                 URL: https://issues.apache.org/jira/browse/SPARK-30070
             Project: Spark
          Issue Type: Sub-task
          Components: SQL
    Affects Versions: 3.0.0
            Reporter: Kent Yao



{code:java}
 * The operator `OVERLAPS` determines whether two chronological periods overlap in time.
 * A chronological period is specified by a pair of datetimes (start and end).
 *
 * If the length of the period is greater than 0, the period consists of all points in time
 * greater than or equal to the lower endpoint and less than the upper endpoint, i.e. the
 * half-open interval [lower, upper).
 *
 * If the length of the period is 0, the period consists of a single point in time, the lower
 * endpoint, i.e. the degenerate interval [lower, lower].
 *
 * Two periods overlap if they have at least one point in common.
{code}
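
Under the half-open [lower, upper) reading above, the predicate reduces to a pair of endpoint comparisons. A minimal sketch of the rule, assuming each period's endpoints have already been normalized so that s1 <= e1 and s2 <= e2 (s1/e1/s2/e2 are illustrative names only) and ignoring the spec's special NULL handling:

{code:sql}
-- Two periods share at least one point iff each starts before the other ends,
-- or both start at the same instant (which also covers the zero-length case).
(s1 < e2 AND s2 < e1) OR s1 = s2
{code}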

PostgreSQL behavior, for reference:

{code:sql}
postgres=# select (cast(a as timestamp), cast(b as timestamp)) overlaps (cast(c as timestamp), cast(d as timestamp)) from (values
 ('2011-11-11', '2011-11-11', '2011-11-11', '2011-11-11'),
 ('2011-11-10', '2011-11-11', '2011-11-11', '2011-11-12'),
 ('2011-11-11', '2011-11-10', '2011-11-11', '2011-11-12'),
 ('2011-11-11', '2011-11-10', '2011-11-12', '2011-11-11'),
 ('2011-11-10', '2011-11-11', '2011-11-12', '2011-11-13'),
 ('2011-11-10', '2011-11-20', '2011-11-11', '2011-11-19'),
 ('2011-11-11', '2011-11-19', '2011-11-10', '2011-11-20')) t(a,b,c,d);
 overlaps
----------
 t
 f
 f
 f
 f
 t
 t
(7 rows)
{code}
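
As a cross-check, the same seven rows can be fed through an explicit rewrite of that rule (again a sketch that ignores the spec's NULL handling); it should yield the same t/f pattern as the built-in predicate:

{code:sql}
-- Normalize each pair with least/greatest, then apply the half-open overlap rule.
select ((s1 < e2 and s2 < e1) or s1 = s2) as manual_overlaps
from (
  select least(cast(a as timestamp), cast(b as timestamp))    as s1,
         greatest(cast(a as timestamp), cast(b as timestamp)) as e1,
         least(cast(c as timestamp), cast(d as timestamp))    as s2,
         greatest(cast(c as timestamp), cast(d as timestamp)) as e2
  from (values
    ('2011-11-11', '2011-11-11', '2011-11-11', '2011-11-11'),
    ('2011-11-10', '2011-11-11', '2011-11-11', '2011-11-12'),
    ('2011-11-11', '2011-11-10', '2011-11-11', '2011-11-12'),
    ('2011-11-11', '2011-11-10', '2011-11-12', '2011-11-11'),
    ('2011-11-10', '2011-11-11', '2011-11-12', '2011-11-13'),
    ('2011-11-10', '2011-11-20', '2011-11-11', '2011-11-19'),
    ('2011-11-11', '2011-11-19', '2011-11-10', '2011-11-20')) t(a, b, c, d)
) norm;
{code}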



