Posted to issues@spark.apache.org by "Dylan Guedes (JIRA)" <ji...@apache.org> on 2019/08/02 12:03:00 UTC

[jira] [Created] (SPARK-28602) Recognize interval as a numeric type

Dylan Guedes created SPARK-28602:
------------------------------------

             Summary: Recognize interval as a numeric type
                 Key: SPARK-28602
                 URL: https://issues.apache.org/jira/browse/SPARK-28602
             Project: Spark
          Issue Type: Sub-task
          Components: SQL
    Affects Versions: 3.0.0
            Reporter: Dylan Guedes


Hello,
Spark does not recognize the `interval` type as a `numeric` one, which means that we can't use `interval` columns in aggregate functions. For instance, the following query works on PgSQL but does not work on Spark:
{code:sql}
SELECT i,
       AVG(CAST(v AS interval)) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING)
FROM (VALUES (1, '1 sec'), (2, '2 sec'), (3, NULL), (4, NULL)) t(i, v);
{code}

Spark instead fails the query at analysis time with:
{code}
cannot resolve 'avg(CAST(`v` AS INTERVAL))' due to data type mismatch: function average requires numeric types, not interval; line 1 pos 9
{code}
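
For completeness, a minimal sketch of the same reproduction through the Scala API, assuming a local SparkSession; it simply runs the query above and prints the AnalysisException message quoted above.

{code:scala}
import org.apache.spark.sql.{AnalysisException, SparkSession}

object IntervalAvgRepro {
  def main(args: Array[String]): Unit = {
    // Local session used only for the reproduction.
    val spark = SparkSession.builder()
      .appName("SPARK-28602-repro")
      .master("local[*]")
      .getOrCreate()

    // Same query as above: AVG over a value cast to interval inside a window frame.
    val query =
      """SELECT i,
        |       AVG(CAST(v AS interval)) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING)
        |FROM (VALUES (1, '1 sec'), (2, '2 sec'), (3, NULL), (4, NULL)) t(i, v)""".stripMargin

    try {
      spark.sql(query).show()
    } catch {
      case e: AnalysisException =>
        // Prints the "function average requires numeric types, not interval" error quoted above.
        println(e.getMessage)
    } finally {
      spark.stop()
    }
  }
}
{code}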
 


