Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2018/09/27 01:03:00 UTC

[jira] [Reopened] (SPARK-25454) Division between operands with negative scale can cause precision loss

     [ https://issues.apache.org/jira/browse/SPARK-25454?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan reopened SPARK-25454:
---------------------------------
      Assignee:     (was: Wenchen Fan)

I'm reopening it, since the bug is not fully fixed. But we do have a workaround now: setting {{spark.sql.legacy.literal.pickMinimumPrecision}} to false.
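
For anyone hitting this before a full fix lands, a minimal spark-shell sketch of applying the workaround (assuming an existing {{spark}} session; this only sets the flag named above, it is not the eventual fix):

{code:scala}
// Workaround from this comment: disable the legacy minimum-precision pick
// for literals in the current session before running the affected query.
spark.conf.set("spark.sql.legacy.literal.pickMinimumPrecision", "false")

// The same setting from SQL, if that is more convenient:
spark.sql("SET spark.sql.legacy.literal.pickMinimumPrecision=false")
{code}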

> Division between operands with negative scale can cause precision loss
> ----------------------------------------------------------------------
>
>                 Key: SPARK-25454
>                 URL: https://issues.apache.org/jira/browse/SPARK-25454
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0, 2.3.1
>            Reporter: Marco Gaido
>            Priority: Major
>
> The issue was originally reported by [~bersprockets] here: https://issues.apache.org/jira/browse/SPARK-22036?focusedCommentId=16618104&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-16618104.
> The problem is a precision loss when the second operand of the division is a decimal with a negative scale. It was also present before 2.3, but it was harder to reproduce: you had to do something like {{lit(BigDecimal(100e6))}}, whereas now it can happen more frequently with SQL constants.
> The problem is that our logic is taken from Hive and SQL Server, where decimals with negative scales are not allowed. We might consider enforcing that restriction in 3.0 eventually as well. Meanwhile, we can fix the logic for computing the result type of a division.
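
A minimal spark-shell sketch of the pre-2.3 style reproduction hinted at above (the dividend value and column name are made up; only the {{lit(BigDecimal(100e6))}} shape comes from the description):

{code:scala}
import org.apache.spark.sql.functions.lit

// BigDecimal(100e6) goes through Double.toString as "1.0E8", i.e. an
// unscaled value of 10 with scale -7: a decimal with a negative scale.
val divisor = BigDecimal(100e6)

// Made-up dividend with high precision; dividing it by the
// negative-scale literal is where the result type can lose precision.
val df = spark.range(1)
  .select(lit(BigDecimal("12345678901234567890.123456")).as("a"))

df.select(df("a") / lit(divisor)).show(false)
{code}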



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org