Posted to issues@spark.apache.org by "Yin Huai (JIRA)" <ji...@apache.org> on 2015/07/15 00:53:04 UTC

[jira] [Commented] (SPARK-9046) Decimal type support improvement

    [ https://issues.apache.org/jira/browse/SPARK-9046?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14627222#comment-14627222 ] 

Yin Huai commented on SPARK-9046:
---------------------------------

[~viirya] [~rtreffer] [~davies] I am wondering whether we should just revert SPARK-8359, SPARK-8800, and SPARK-8677. My reason is that Java does not really support unlimited-precision decimal values, and these changes have not made our decimal support better (SPARK-8800 was caused by the fix of SPARK-8359, SPARK-8677 was caused by the fix of SPARK-8800, and the fix of SPARK-8677 causes SPARK-8800 again). Considering the problems introduced after we merged the patch for SPARK-8359, and the potential for more issues later, I believe the right thing to do is to figure out the maximum precision we want to support. Also, when we do operations between two decimal types, we should use our precision/scale matrix to figure out the right precision/scale for the result.
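To illustrate what such a precision/scale matrix could look like, here is a minimal sketch in the Hive/SQL Server style of rules. The function names, the MAX_PRECISION cap, and the formulas shown are illustrative assumptions for discussion, not the actual Spark implementation:

```python
# Sketch of a decimal precision/scale result matrix for arithmetic between
# two DecimalType(p, s) operands, following Hive/SQL Server-style rules.
# MAX_PRECISION and the function names are assumptions for illustration.

MAX_PRECISION = 38  # assumed fixed upper bound instead of unlimited precision

def add_result(p1, s1, p2, s2):
    """Result (precision, scale) of d1 + d2 for DecimalType(p1, s1), DecimalType(p2, s2)."""
    scale = max(s1, s2)                             # keep the finer scale
    precision = max(p1 - s1, p2 - s2) + scale + 1   # widest integer part + 1 carry digit
    return min(precision, MAX_PRECISION), scale

def multiply_result(p1, s1, p2, s2):
    """Result (precision, scale) of d1 * d2."""
    precision = p1 + p2 + 1
    scale = s1 + s2
    return min(precision, MAX_PRECISION), scale

# e.g. DECIMAL(10, 2) + DECIMAL(5, 4):
#   integer digits = max(10-2, 5-4) = 8, scale = max(2, 4) = 4, plus 1 carry digit
print(add_result(10, 2, 5, 4))       # (13, 4)
print(multiply_result(10, 2, 5, 4))  # (16, 6)
```

With a fixed MAX_PRECISION, every operation yields a well-defined bounded result type, which avoids the "unlimited precision" cases that the reverted patches tried to handle.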

> Decimal type support improvement
> --------------------------------
>
>                 Key: SPARK-9046
>                 URL: https://issues.apache.org/jira/browse/SPARK-9046
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Yin Huai
>            Priority: Critical
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org