Posted to issues@spark.apache.org by "Juliusz Sompolski (Jira)" <ji...@apache.org> on 2020/06/18 11:27:00 UTC

[jira] [Updated] (SPARK-32021) make_interval does not accept seconds >100

     [ https://issues.apache.org/jira/browse/SPARK-32021?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Juliusz Sompolski updated SPARK-32021:
--------------------------------------
    Description: 
In make_interval(years, months, weeks, days, hours, mins, secs), secs is defined as Decimal(8, 6), which leaves room for only two integer digits; the expression therefore evaluates to null whenever the seconds value reaches 100.
Larger seconds values should be allowed.

This has been reported by Simba, who wants to use make_interval to implement the translation of the TIMESTAMP_ADD ODBC function in Spark 3.0.
The ODBC escape {fn TIMESTAMPADD(SECOND, integer_exp, timestamp)} fails whenever integer_exp evaluates to a seconds value >= 100.
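
The boundary follows from the declared type: Decimal(8, 6) means precision 8 with scale 6, leaving only two integer digits, so 99.999999 is the largest representable seconds value. A minimal sketch of that fit check using Python's decimal module (an illustration of the precision/scale arithmetic, not Spark's actual implementation):

```python
from decimal import Decimal

def fits_decimal_8_6(value: str) -> bool:
    """Return True if value fits a SQL Decimal(8, 6):
    at most 8 significant digits, 6 of them fractional,
    which leaves room for only 2 integer digits."""
    _, digits, exponent = Decimal(value).as_tuple()
    frac_digits = max(0, -exponent)              # digits after the point
    int_digits = max(0, len(digits) + exponent)  # digits before the point
    return frac_digits <= 6 and int_digits <= 2

# 99.999999 still fits; 100 does not, and that is the point at which
# Spark turns the make_interval result into null instead of erroring.
```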

  was:
In make_interval(years, months, weeks, days, hours, mins, secs), secs is defined as Decimal(8, 6), which leaves room for only two integer digits; the expression therefore evaluates to null whenever the seconds value reaches 100.
Larger seconds values should be allowed.

This has been reported by Simba, who wants to use make_interval to implement the translation of the TIMESTAMP_ADD ODBC function in Spark 3.0.


> make_interval does not accept seconds >100
> ------------------------------------------
>
>                 Key: SPARK-32021
>                 URL: https://issues.apache.org/jira/browse/SPARK-32021
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Juliusz Sompolski
>            Priority: Major
>
> In make_interval(years, months, weeks, days, hours, mins, secs), secs is defined as Decimal(8, 6), which leaves room for only two integer digits; the expression therefore evaluates to null whenever the seconds value reaches 100.
> Larger seconds values should be allowed.
> This has been reported by Simba, who wants to use make_interval to implement the translation of the TIMESTAMP_ADD ODBC function in Spark 3.0.
> The ODBC escape {fn TIMESTAMPADD(SECOND, integer_exp, timestamp)} fails whenever integer_exp evaluates to a seconds value >= 100.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org