Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2020/03/16 22:51:06 UTC

[jira] [Updated] (SPARK-26664) Make DecimalType's minimum adjusted scale configurable

     [ https://issues.apache.org/jira/browse/SPARK-26664?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-26664:
----------------------------------
    Affects Version/s:     (was: 3.0.0)
                       3.1.0

> Make DecimalType's minimum adjusted scale configurable
> ------------------------------------------------------
>
>                 Key: SPARK-26664
>                 URL: https://issues.apache.org/jira/browse/SPARK-26664
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: Kris Mok
>            Priority: Minor
>
> Introduce a new conf flag that lets users set the value of {{DecimalType.MINIMUM_ADJUSTED_SCALE}}, currently a constant of 6, to match their workloads' needs.
> The new flag will be {{spark.sql.decimalOperations.minimumAdjustedScale}}.
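> For illustration, if the flag lands as proposed it would be set like any other SQL conf. A minimal sketch, assuming an integer-valued conf under the proposed name (which does not exist in any released Spark):
> {code:scala}
> // Hypothetical usage of the proposed flag; the default minimum adjusted scale is 6.
> spark.conf.set("spark.sql.decimalOperations.minimumAdjustedScale", "12")
> {code}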
> SPARK-22036 introduced a new conf flag, {{spark.sql.decimalOperations.allowPrecisionLoss}}, to match the behavior of SQL Server and newer versions of Hive, which allow precision loss when multiplying or dividing large and small decimal numbers.
> As part of that feature, a fixed {{MINIMUM_ADJUSTED_SCALE}} of 6 was introduced for the case where precision loss is allowed.
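> For context, a rough Scala sketch of that adjustment (a paraphrase of {{DecimalType.adjustPrecisionScale}}, not the exact source; {{MAX_PRECISION}} is 38, and the negative-scale branch is omitted here):
> {code:scala}
> val MAX_PRECISION = 38
> val MINIMUM_ADJUSTED_SCALE = 6 // the constant this ticket proposes to make configurable
>
> // Returns the (precision, scale) actually used for an arithmetic result type
> // when the ideal result type no longer fits in 38 digits.
> def adjustPrecisionScale(precision: Int, scale: Int): (Int, Int) = {
>   if (precision <= MAX_PRECISION) {
>     (precision, scale) // the result already fits; nothing to adjust
>   } else {
>     // Keep all integral digits and give the fractional part whatever room is left,
>     // but never cut the scale below min(scale, MINIMUM_ADJUSTED_SCALE).
>     val intDigits = precision - scale
>     val minScale = math.min(scale, MINIMUM_ADJUSTED_SCALE)
>     val adjustedScale = math.max(MAX_PRECISION - intDigits, minScale)
>     (MAX_PRECISION, adjustedScale)
>   }
> }
> {code}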
> Some customer workloads may need a larger adjusted scale to match their business needs, and in exchange they may be willing to tolerate more calculations overflowing the maximum precision and producing nulls. They would therefore like the minimum adjusted scale to be configurable, hence the need for a new conf.
> Introducing this conf flag does not change the default behavior.
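> As a worked example using the sketch above: with SPARK-22036's division rule {{scale = max(6, s1 + p2 + 1)}} and {{precision = p1 - s1 + s2 + scale}}, dividing two {{decimal(38, 10)}} values gives an ideal result type of precision 87 and scale 49, and {{adjustPrecisionScale(87, 49)}} returns {{(38, 6)}}, so only 6 fractional digits survive. A workload needing, say, 12 fractional digits could raise the proposed conf to 12 and accept the correspondingly higher risk of overflow (and thus nulls).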



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org