Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2019/09/09 05:52:00 UTC

[jira] [Resolved] (SPARK-29000) [SQL] Decimal precision overflow when precision loss is not allowed

     [ https://issues.apache.org/jira/browse/SPARK-29000?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan resolved SPARK-29000.
---------------------------------
    Fix Version/s: 3.0.0
       Resolution: Fixed

Issue resolved by pull request 25701
[https://github.com/apache/spark/pull/25701]

> [SQL] Decimal precision overflow when precision loss is not allowed
> -------------------------------------------------------------------
>
>                 Key: SPARK-29000
>                 URL: https://issues.apache.org/jira/browse/SPARK-29000
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.4
>            Reporter: feiwang
>            Priority: Major
>             Fix For: 3.0.0
>
>         Attachments: screenshot-1.png
>
>
> When we set spark.sql.decimalOperations.allowPrecisionLoss=false,
> the result of the SQL below overflows and returns null.
> {code:sql}
> -- overflows and returns NULL when allowPrecisionLoss=false
> select case when 1=2 then 1 else 100.000000000000000000000000 end * 1
> {code}
> However, this SQL returns the correct result.
> {code:sql}
> -- returns the correct result (both operands are decimals here)
> select case when 1=2 then 1 else 100.000000000000000000000000 end * 1.0
> {code}
> The reason is that there are issues in the binary-operator handling between a non-decimal and a decimal operand.
> In fact, there is a nondecimalAndDecimal method in the DecimalPrecision class;
> I copied its implementation into the body of the ImplicitTypeCasts.coerceTypes() method.
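A note on the likely arithmetic (an inference from Spark's documented decimal rules, not a statement from this report): for multiplication, Spark computes the result type as precision p1 + p2 + 1 and scale s1 + s2, capped at DECIMAL(38). Since the 1.0 variant above shows the decimal-by-decimal path behaving correctly, an explicit cast of the integer operand is a plausible workaround sketch on the affected 2.4.x versions:

{code:sql}
-- Hypothetical workaround, assuming the decimal-by-decimal path is sound
-- (as the working 1.0 variant in the report suggests): cast the integer
-- operand explicitly so the non-decimal/decimal binary-operator path is
-- never taken.
set spark.sql.decimalOperations.allowPrecisionLoss=false;
select case when 1=2 then 1 else 100.000000000000000000000000 end
       * cast(1 as decimal(10, 0));
{code}

The cast target decimal(10, 0) mirrors the decimal type Spark conventionally uses when widening an integer; any decimal type wide enough for the operand should behave the same way here.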



--
This message was sent by Atlassian Jira
(v8.3.2#803003)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org