Posted to issues@spark.apache.org by "Daeho Ro (Jira)" <ji...@apache.org> on 2020/09/09 01:52:00 UTC

[jira] [Commented] (SPARK-27089) Loss of precision during decimal division

    [ https://issues.apache.org/jira/browse/SPARK-27089?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17192566#comment-17192566 ] 

Daeho Ro commented on SPARK-27089:
----------------------------------

It seems that the bug persists in Spark version 3.0.0.
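
For reference, a minimal re-check in a 3.0.0 spark-shell. The query is the one from the description; the {{spark.sql.decimalOperations.allowPrecisionLoss}} line is only a possible mitigation (not verified here), not a fix for the default behaviour:

{code:java}
// Same query as in the description, run against Spark 3.0.0.
val sql = """select cast(cast(3 as decimal(38,14)) / cast(9 as decimal(38,14)) as decimal(38,14)) val"""
spark.sql(sql).show()   // still prints 0.33333300000000 with default settings

// Possible mitigation (assumption, not verified here): keep the full scale of
// the division by disabling precision loss, so the outer cast does not have to
// pad a truncated result with zeros.
spark.conf.set("spark.sql.decimalOperations.allowPrecisionLoss", "false")
spark.sql(sql).show()   // expected: 0.33333333333333
{code}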

> Loss of precision during decimal division
> -----------------------------------------
>
>                 Key: SPARK-27089
>                 URL: https://issues.apache.org/jira/browse/SPARK-27089
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0, 2.4.0
>            Reporter: ylo0ztlmtusq
>            Priority: Major
>
> Spark loses decimal places when dividing decimal numbers.
>  
> Expected behavior (in Spark 2.2.3 or earlier)
>  
> {code:java}
> scala> val sql = """select cast(cast(3 as decimal(38,14)) / cast(9 as decimal(38,14)) as decimal(38,14)) val"""
> sql: String = select cast(cast(3 as decimal(38,14)) / cast(9 as decimal(38,14)) as decimal(38,14)) val
> scala> spark.sql(sql).show
> 19/03/07 21:23:51 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
> +----------------+
> |             val|
> +----------------+
> |0.33333333333333|
> +----------------+
> {code}
>  
> Current behavior (in Spark 2.3.2 and later)
>  
> {code:java}
> scala> val sql = """select cast(cast(3 as decimal(38,14)) / cast(9 as decimal(38,14)) as decimal(38,14)) val"""
> sql: String = select cast(cast(3 as decimal(38,14)) / cast(9 as decimal(38,14)) as decimal(38,14)) val
> scala> spark.sql(sql).show
> +----------------+
> |             val|
> +----------------+
> |0.33333300000000|
> +----------------+
> {code}
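> 
> The intermediate type of the bare division can be checked directly; this is just an illustrative check, and the schema it should print follows from the {{DecimalType(38,6)}} that shows up in the plan further down:
> 
> {code:java}
> // Drop the outer cast and inspect the division's result type
> // ("div_only" is just an alias used for this illustration).
> spark.sql("select cast(3 as decimal(38,14)) / cast(9 as decimal(38,14)) div_only").printSchema()
> // should print: div_only: decimal(38,6) (nullable = true)
> {code}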
>  
> Seems to be caused by the {{promote_precision}}/{{CheckOverflow}} step in the analyzed plan below, which evaluates the division as {{DecimalType(38,6)}}: only six fractional digits survive, and the outer cast back to {{decimal(38,14)}} just pads the result with zeros (a sketch of the precision arithmetic follows the plan).
>  
> {code:java}
> scala> spark.sql(sql).explain(true)
> == Parsed Logical Plan ==
> Project [cast((cast(3 as decimal(38,14)) / cast(9 as decimal(38,14))) as decimal(38,14)) AS val#20]
> +- OneRowRelation
> == Analyzed Logical Plan ==
> val: decimal(38,14)
> Project [cast(CheckOverflow((promote_precision(cast(cast(3 as decimal(38,14)) as decimal(38,14))) / promote_precision(cast(cast(9 as decimal(38,14)) as decimal(38,14)))), DecimalType(38,6)) as decimal(38,14)) AS val#20]
> +- OneRowRelation
> == Optimized Logical Plan ==
> Project [0.33333300000000 AS val#20]
> +- OneRowRelation
> == Physical Plan ==
> *(1) Project [0.33333300000000 AS val#20]
> +- Scan OneRowRelation[]
> {code}
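> 
> Where the {{(38,6)}} comes from, as far as I can tell: for decimal division Spark sizes the result Hive-style (scale = max(6, s1 + p2 + 1), precision = p1 - s1 + s2 + scale), and when the precision overflows 38 the scale is cut back, but never below 6. A minimal sketch of that arithmetic, assuming those rules; it is illustrative, not Spark source:
> 
> {code:java}
> object DecimalDivisionType {
>   val MaxPrecision = 38
>   val MinAdjustedScale = 6
> 
>   // Result type of decimal(p1,s1) / decimal(p2,s2) when precision loss is allowed.
>   def divisionType(p1: Int, s1: Int, p2: Int, s2: Int): (Int, Int) = {
>     val intDigits = p1 - s1 + s2                           // digits kept left of the point
>     val scale     = math.max(MinAdjustedScale, s1 + p2 + 1)
>     val precision = intDigits + scale
>     if (precision <= MaxPrecision) (precision, scale)
>     else {
>       // Over 38 digits: keep the integer digits and shrink the scale,
>       // but never below the minimum adjusted scale of 6.
>       val adjustedScale = math.max(MaxPrecision - intDigits, math.min(scale, MinAdjustedScale))
>       (MaxPrecision, adjustedScale)
>     }
>   }
> 
>   def main(args: Array[String]): Unit = {
>     // decimal(38,14) / decimal(38,14): intDigits = 38, scale = 53, precision = 91
>     // -> adjusted to (38,6), so only six fractional digits survive before the
>     //    outer cast back to decimal(38,14) pads with zeros.
>     println(divisionType(38, 14, 38, 14))   // (38,6)
>   }
> }
> {code}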
>  
> Source: https://stackoverflow.com/q/55046492



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
