Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2021/08/06 02:10:00 UTC

[jira] [Assigned] (SPARK-36431) Support comparison of ANSI intervals with different fields

     [ https://issues.apache.org/jira/browse/SPARK-36431?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-36431:
------------------------------------

    Assignee: Apache Spark

> Support comparison of ANSI intervals with different fields
> ----------------------------------------------------------
>
>                 Key: SPARK-36431
>                 URL: https://issues.apache.org/jira/browse/SPARK-36431
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: Max Gekk
>            Assignee: Apache Spark
>            Priority: Major
>
> Support comparison of:
> - a day-time interval with another day-time interval that has different fields
> - a year-month interval with another year-month interval that has different fields
> The examples below show the issue:
> {code:sql}
> spark-sql> select interval '1' day > interval '1' hour;
> Error in query: cannot resolve '(INTERVAL '1' DAY > INTERVAL '01' HOUR)' due to data type mismatch: differing types in '(INTERVAL '1' DAY > INTERVAL '01' HOUR)' (interval day and interval hour).; line 1 pos 7;
> 'Project [unresolvedalias((INTERVAL '1' DAY > INTERVAL '01' HOUR), None)]
> +- OneRowRelation
> spark-sql> select interval '2' year > interval '11' month;
> Error in query: cannot resolve '(INTERVAL '2' YEAR > INTERVAL '11' MONTH)' due to data type mismatch: differing types in '(INTERVAL '2' YEAR > INTERVAL '11' MONTH)' (interval year and interval month).; line 1 pos 7;
> 'Project [unresolvedalias((INTERVAL '2' YEAR > INTERVAL '11' MONTH), None)]
> +- OneRowRelation
> {code}
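>
> A minimal sketch in Scala of one possible coercion rule (an illustration only, not the actual patch): regardless of the declared fields, a day-time interval is physically a long count of microseconds and a year-month interval an int count of months, so the analyzer could widen both sides of a comparison to a common interval type spanning the union of their fields:
> {code:scala}
> import org.apache.spark.sql.types.{DayTimeIntervalType, YearMonthIntervalType}
>
> // Hypothetical helpers: compute the widest interval type covering both operands.
> // DayTimeIntervalType fields are ordered DAY(0) < HOUR(1) < MINUTE(2) < SECOND(3),
> // so the common type spans min(startField) .. max(endField).
> def commonDayTimeType(a: DayTimeIntervalType, b: DayTimeIntervalType): DayTimeIntervalType =
>   DayTimeIntervalType(a.startField.min(b.startField), a.endField.max(b.endField))
>
> // Same idea for year-month intervals, where YEAR(0) < MONTH(1).
> def commonYearMonthType(a: YearMonthIntervalType, b: YearMonthIntervalType): YearMonthIntervalType =
>   YearMonthIntervalType(a.startField.min(b.startField), a.endField.max(b.endField))
> {code}
> Under such a rule, {{INTERVAL '1' DAY > INTERVAL '01' HOUR}} would be resolved as a comparison of two {{INTERVAL DAY TO HOUR}} values and evaluate to true (one day is 24 hours), since both operands share the same physical representation.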


