Posted to issues@spark.apache.org by "Kris Mok (JIRA)" <ji...@apache.org> on 2019/01/18 17:04:00 UTC

[jira] [Created] (SPARK-26664) Make DecimalType's minimum adjusted scale configurable

Kris Mok created SPARK-26664:
--------------------------------

             Summary: Make DecimalType's minimum adjusted scale configurable
                 Key: SPARK-26664
                 URL: https://issues.apache.org/jira/browse/SPARK-26664
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 3.0.0
            Reporter: Kris Mok


Introduce a new conf flag that allows users to set the value of {{DecimalType.MINIMUM_ADJUSTED_SCALE}}, currently a constant of 6, to match their workloads' needs.

The new flag will be {{spark.sql.decimalOperations.minimumAdjustedScale}}.
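If the flag is added as proposed, it would presumably be set like any other SQL conf. The following is a hypothetical sketch only, since neither the conf name nor its final form is settled yet:

{code:scala}
// Hypothetical usage, assuming the conf is added under the proposed name.
// As of this ticket the flag does not exist in Spark.
spark.conf.set("spark.sql.decimalOperations.minimumAdjustedScale", "9")

// or equivalently in SQL:
//   SET spark.sql.decimalOperations.minimumAdjustedScale=9;
{code}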

SPARK-22036 introduced a new conf flag, {{spark.sql.decimalOperations.allowPrecisionLoss}}, to match the behavior of SQL Server and newer versions of Hive, which allow precision loss when performing multiplication/division on big and small decimal numbers.
As part of that change, a fixed {{MINIMUM_ADJUSTED_SCALE}} of 6 was introduced for the case where precision loss is allowed.
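For reference, a minimal sketch (not the actual Spark source) of the adjustment SPARK-22036 performs when a decimal result type would exceed the maximum precision of 38; the hard-coded 6 below is the minimum adjusted scale this ticket proposes to make configurable:

{code:scala}
object DecimalAdjustSketch {
  val MaxPrecision = 38
  val MinAdjustedScale = 6 // hard-coded today; proposed to become a conf

  /** Returns the (precision, scale) actually used for a result type. */
  def adjust(precision: Int, scale: Int): (Int, Int) = {
    if (precision <= MaxPrecision) {
      (precision, scale) // the exact result type fits, keep it
    } else {
      val intDigits = precision - scale // digits left of the decimal point
      // Cut the scale to make room for the integer digits, but never below
      // the minimum adjusted scale (or the requested scale, if smaller).
      val minScale = math.min(scale, MinAdjustedScale)
      val adjustedScale = math.max(MaxPrecision - intDigits, minScale)
      (MaxPrecision, adjustedScale)
    }
  }

  def main(args: Array[String]): Unit = {
    // decimal(38,18) * decimal(38,18) asks for decimal(77,36);
    // under these rules it becomes decimal(38,6).
    println(adjust(77, 36)) // prints (38,6)
  }
}
{code}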

Some customer workloads may need a larger adjusted scale to match their business needs, and in exchange they may be willing to tolerate more calculations overflowing the maximum precision and producing nulls. Such users would like the minimum adjusted scale to be configurable, hence the need for a new conf.
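To make the trade-off concrete, here is a hedged spark-shell example of today's behavior (with {{spark.sql.decimalOperations.allowPrecisionLoss}} at its default of true); the comments about a larger minimum only illustrate the proposal:

{code:scala}
// Multiplying two decimal(38,18) values cannot keep all 36 fractional
// digits within precision 38, so the scale is cut back, but never below
// the fixed minimum of 6.
val df = spark.sql(
  "SELECT CAST(1.2345678901 AS DECIMAL(38,18)) * CAST(2 AS DECIMAL(38,18)) AS p")
df.printSchema()
// root
//  |-- p: decimal(38,6) (nullable = true)

// With a configurable minimum of, say, 18, the result type would instead be
// decimal(38,18): more fractional digits are kept, but only 20 integer
// digits remain, so products with large integer parts would overflow to null.
{code}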

The default behavior is unchanged after introducing this conf flag.



