Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/06/27 12:46:04 UTC

[jira] [Commented] (SPARK-8677) Decimal divide operation throws ArithmeticException

    [ https://issues.apache.org/jira/browse/SPARK-8677?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14604110#comment-14604110 ] 

Apache Spark commented on SPARK-8677:
-------------------------------------

User 'viirya' has created a pull request for this issue:
https://github.com/apache/spark/pull/7056

> Decimal divide operation throws ArithmeticException
> ---------------------------------------------------
>
>                 Key: SPARK-8677
>                 URL: https://issues.apache.org/jira/browse/SPARK-8677
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Liang-Chi Hsieh
>
> Please refer to the [BigDecimal doc|http://docs.oracle.com/javase/1.5.0/docs/api/java/math/BigDecimal.html]:
> {quote}
> ... the rounding mode setting of a MathContext object with a precision setting of 0 is not used and thus irrelevant. In the case of divide, the exact quotient could have an infinitely long decimal expansion; for example, 1 divided by 3.
> {quote}
> Because we provide MathContext.UNLIMITED in toBigDecimal, the Decimal divide operation throws the following exception:
> {code}
> val decimal = Decimal(1.0, 10, 3) / Decimal(3.0, 10, 3)
> [info]   java.lang.ArithmeticException: Non-terminating decimal expansion; no exact representable decimal result.
> [info]   at java.math.BigDecimal.divide(BigDecimal.java:1690)
> [info]   at java.math.BigDecimal.divide(BigDecimal.java:1723)
> [info]   at scala.math.BigDecimal.$div(BigDecimal.scala:256)
> [info]   at org.apache.spark.sql.types.Decimal.$div(Decimal.scala:272)
> {code}
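For illustration only, here is a minimal standalone sketch (not taken from the issue or the PR) that reproduces the underlying java.math.BigDecimal behavior and shows how a bounded MathContext avoids the exception. The precision of 38 is an assumption chosen to mirror Spark SQL's maximum decimal precision; the actual context used by the fix is whatever the linked pull request implements.

{code}
import java.math.{MathContext, RoundingMode}

object DecimalDivideSketch extends App {
  // scala.math.BigDecimal's `/` delegates to java.math.BigDecimal.divide,
  // using the MathContext attached to the left-hand operand.
  val unbounded = BigDecimal(1.0, MathContext.UNLIMITED)

  // 1/3 has an infinitely long decimal expansion, so with unlimited
  // precision divide() throws ArithmeticException -- the failure
  // reported in this issue.
  try unbounded / BigDecimal(3.0)
  catch {
    case e: ArithmeticException => println(s"UNLIMITED: ${e.getMessage}")
  }

  // With a bounded MathContext the quotient is rounded instead of
  // throwing. Precision 38 is assumed here to mirror Spark SQL's
  // decimal limit; any finite precision terminates the division.
  val bounded = new MathContext(38, RoundingMode.HALF_UP)
  println(BigDecimal(1.0, bounded) / BigDecimal(3.0)) // 0.333... (38 digits)
}
{code}

The key point is that the exception comes from the MathContext, not from Decimal itself: any finite precision (with a rounding mode) makes divide() return a rounded quotient instead of demanding an exact result.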


