Posted to issues@spark.apache.org by "Vinod KC (JIRA)" <ji...@apache.org> on 2018/08/21 14:57:00 UTC

[jira] [Created] (SPARK-25177) When a dataframe decimal type column has a scale higher than 6, 0 values are shown in scientific notation

Vinod KC created SPARK-25177:
--------------------------------

             Summary: When a dataframe decimal type column has a scale higher than 6, 0 values are shown in scientific notation
                 Key: SPARK-25177
                 URL: https://issues.apache.org/jira/browse/SPARK-25177
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.4.0
            Reporter: Vinod KC


If the scale of a decimal type column is greater than 6, the value 0 is shown in scientific notation. As a result, when the dataframe output is saved to an external database, the save fails due to the scientific notation on the "0" values.

Eg: In Spark
--------------
spark.sql("create table test (a decimal(10,7), b decimal(10,6), c decimal(10,8))")
spark.sql("insert into test values(0, 0, 0)")
spark.sql("insert into test values(1, 1, 1)")
spark.table("test").show()
+---------+--------+----------+
|        a|       b|         c|
+---------+--------+----------+
|     0E-7|0.000000|      0E-8|   // if scale > 6, zero is displayed in scientific notation
|1.0000000|1.000000|1.00000000|
+---------+--------+----------+
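The behavior above is consistent with how the JVM renders decimals: java.math.BigDecimal.toString switches to scientific notation when the adjusted exponent drops below -6, while toPlainString always keeps the plain form. A minimal sketch of that difference (plain Scala, no Spark required; this only illustrates the likely underlying cause, not Spark's exact code path):

```scala
import java.math.BigDecimal

// Zero at scale 7: adjusted exponent is -7, which is below the -6
// threshold, so toString uses scientific notation.
val zero7 = new BigDecimal("0").setScale(7)
println(zero7.toString)      // prints "0E-7"
println(zero7.toPlainString) // prints "0.0000000"

// Zero at scale 6: adjusted exponent is exactly -6, so toString
// still produces the plain form.
val zero6 = new BigDecimal("0").setScale(6)
println(zero6.toString)      // prints "0.000000"
```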

Eg: In PostgreSQL
--------------
CREATE TABLE Testdec (a DECIMAL(10,7), b DECIMAL(10,6), c DECIMAL(10,8));
INSERT INTO Testdec VALUES (0,0,0);
INSERT INTO Testdec VALUES (1,1,1);
select * from Testdec;
Result:
     a     |    b     |     c
-----------+----------+------------
 0.0000000 | 0.000000 | 0.00000000
 1.0000000 | 1.000000 | 1.00000000

We should make the Spark SQL result consistent with other databases such as PostgreSQL.
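Until the display is fixed, one possible workaround is to render decimal values via toPlainString (for example, wrapped in a UDF) before writing to the external database. The helper below is hypothetical and its name is an assumption, not part of Spark:

```scala
import java.math.BigDecimal

// Hypothetical helper: render a decimal in plain (non-scientific) form.
// A Spark UDF could wrap this to pre-format columns before export.
def plainDecimal(d: BigDecimal): String =
  if (d == null) null else d.toPlainString

// plainDecimal(new BigDecimal("0").setScale(8)) yields "0.00000000"
// instead of the "0E-8" shown by the dataframe above.
```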

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org