Posted to issues@spark.apache.org by "Rene Treffer (JIRA)" <ji...@apache.org> on 2015/06/14 08:44:00 UTC

[jira] [Created] (SPARK-8359) Spark SQL Decimal type precision loss on multiplication

Rene Treffer created SPARK-8359:
-----------------------------------

             Summary: Spark SQL Decimal type precision loss on multiplication
                 Key: SPARK-8359
                 URL: https://issues.apache.org/jira/browse/SPARK-8359
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.5.0
            Reporter: Rene Treffer


It looks like a Decimal value cannot grow beyond ~2^112 through multiplication without being truncated, no matter how high its precision is raised.

The following code computes powers of two up to a given exponent, raising the precision at every step:
{code}
import org.apache.spark.sql.types.Decimal

val one = Decimal(1)
val two = Decimal(2)

// Recursively compute 2^n, raising the precision to n digits at every step.
// 2^n has fewer than n decimal digits, so a precision of n should always be enough.
def pow(n: Int): Decimal = if (n <= 0) { one } else {
  val a = pow(n - 1)
  a.changePrecision(n, 0)
  two.changePrecision(n, 0)
  a * two
}

// Print the unscaled (integer) value of 2^n for n = 109..120.
(109 to 120).foreach(n => println(pow(n).toJavaBigDecimal.unscaledValue.toString))
649037107316853453566312041152512
1298074214633706907132624082305024
2596148429267413814265248164610048
5192296858534827628530496329220096
1038459371706965525706099265844019
2076918743413931051412198531688038
4153837486827862102824397063376076
8307674973655724205648794126752152
1661534994731144841129758825350430
3323069989462289682259517650700860
6646139978924579364519035301401720
1329227995784915872903807060280344
{code}
Beyond ~2^112 the value is truncated: 2^113 should be 10384593717069655257060992658440192 (35 digits), but only the first 34 digits are printed, even though the precision was explicitly raised to n and should therefore hold the full value without loss.
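
For comparison, a minimal cross-check (not part of the original reproduction, and using only the JDK's java.math.BigInteger rather than Spark's Decimal) prints the full, untruncated values, which suggests the loss happens in Decimal's multiplication/precision handling rather than in the arithmetic itself:
{code}
// Cross-check added for illustration: compute the same powers with
// java.math.BigInteger, which has no precision limit, to see the
// expected untruncated output for n = 109..120.
import java.math.BigInteger

(109 to 120).foreach { n =>
  println(BigInteger.valueOf(2).pow(n).toString)
}
{code}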


