Posted to dev@avro.apache.org by "Ryan Skraba (Jira)" <ji...@apache.org> on 2020/06/01 06:49:00 UTC

[jira] [Updated] (AVRO-2837) Java DecimalConversion handling of scale and precision

     [ https://issues.apache.org/jira/browse/AVRO-2837?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ryan Skraba updated AVRO-2837:
------------------------------
    Fix Version/s: 1.10.0
         Assignee: Matthew McMahon
       Resolution: Fixed
           Status: Resolved  (was: Patch Available)

Thanks for the PR and the fix!  This looks great!

To sum up the conversation: when a BigDecimal datum is applied to a decimal logical type with a different precision and scale, the conversion now throws an exception if the numeric value can't "fit" in the schema.

For precision 5 and scale 2, the following values are OK: 123.45, 00123.4, 23.4500

The following values are not OK: 9123, 1.234
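The rule can be sketched in plain BigDecimal terms (an illustrative sketch, not the actual DecimalConversion code -- the class and method names here are made up for the example):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DecimalFitCheck {
    // Illustrative "fits" rule: the value must be losslessly rescalable to
    // the schema scale, and the rescaled value must not exceed the schema
    // precision. Not Avro's actual implementation.
    static boolean fits(BigDecimal value, int precision, int scale) {
        try {
            // Throws ArithmeticException if rescaling would drop nonzero digits
            BigDecimal rescaled = value.setScale(scale, RoundingMode.UNNECESSARY);
            return rescaled.precision() <= precision;
        } catch (ArithmeticException e) {
            return false; // more nonzero fractional digits than the schema allows
        }
    }

    public static void main(String[] args) {
        // precision 5, scale 2
        System.out.println(fits(new BigDecimal("123.45"), 5, 2));  // true
        System.out.println(fits(new BigDecimal("00123.4"), 5, 2)); // true: pads to 123.40
        System.out.println(fits(new BigDecimal("23.4500"), 5, 2)); // true: trailing zeros drop losslessly
        System.out.println(fits(new BigDecimal("9123"), 5, 2));    // false: 9123.00 needs precision 6
        System.out.println(fits(new BigDecimal("1.234"), 5, 2));   // false: scale 3 > 2
    }
}
```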

> Java DecimalConversion handling of scale and precision
> ------------------------------------------------------
>
>                 Key: AVRO-2837
>                 URL: https://issues.apache.org/jira/browse/AVRO-2837
>             Project: Apache Avro
>          Issue Type: Bug
>          Components: java, logical types
>    Affects Versions: 1.8.2, 1.9.2
>            Reporter: Matthew McMahon
>            Assignee: Matthew McMahon
>            Priority: Major
>             Fix For: 1.10.0
>
>         Attachments: AVRO-2837.patch, AVRO-2837.patch
>
>
> Came across an interesting issue in Avro 1.8.2
> Configured a decimal logical type (Fixed type of size 12 with scale of 15 and precision of 28).
> Due to an upstream bug, a value of 1070464558597365250.000000000000000 (1.07046455859736525E+18 that is then rescaled to 15) appears, and the DecimalConversion attempts to turn it into a Fixed type.
> This should have failed: the value has a precision of 34, and its unscaled bytes won't fit into the 12 available bytes (14 are needed). However, 1.8.2 still writes a value, and downstream processing only later works out that it is invalid and errors.
> Basically the top 2 bytes are thrown away.
> In 1.9.2 this particular case fails, due to the change in https://issues.apache.org/jira/browse/AVRO-2309: the conversion now attempts to pass an offset of -2 to the System.arraycopy method, which throws.
> That seems ok, but the error is a bit of a red herring with respect to the actual issue, and precision is still not actually being checked.
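For concreteness, the size mismatch described above can be reproduced with plain BigDecimal arithmetic (a standalone sketch, not Avro code):

```java
import java.math.BigDecimal;

public class FixedOverflowDemo {
    public static void main(String[] args) {
        // The problematic value from the report, already at scale 15.
        BigDecimal value = new BigDecimal("1070464558597365250.000000000000000");
        byte[] unscaled = value.unscaledValue().toByteArray();

        System.out.println(value.precision()); // 34 -- exceeds the schema precision of 28
        System.out.println(unscaled.length);   // 14 -- but the fixed schema has only 12 bytes

        // Copying 14 bytes into a 12-byte fixed buffer implies a source
        // offset of 12 - 14 = -2; System.arraycopy rejects that in 1.9.2,
        // whereas 1.8.2 silently dropped the top two bytes.
    }
}
```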
> Proposing a couple changes to the DecimalConversion:
>  * Check the precision set on the decimal logical type. If the value has greater precision, then error with a more informative message
>  * Still check the scale, and error if the value has a greater scale. However, if the value's scale is smaller, then it seems safe to rescale it and pad with zeros rather than error
>  * Do this for both Bytes and Fixed types
>  
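For the Fixed case, the proposed checks could look roughly like the following (a sketch only; the class and method names are illustrative, not Avro's actual API):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;
import java.util.Arrays;

public class ProposedFixedEncoding {
    // Sketch of the proposed behavior: rescale losslessly, check precision,
    // and error instead of silently truncating high-order bytes.
    static byte[] toFixedBytes(BigDecimal value, int size, int precision, int scale) {
        BigDecimal rescaled;
        try {
            // Pads zeros when the value's scale is lower; throws if nonzero
            // fractional digits would have to be dropped.
            rescaled = value.setScale(scale, RoundingMode.UNNECESSARY);
        } catch (ArithmeticException e) {
            throw new ArithmeticException("Scale " + value.scale()
                + " cannot be reduced to schema scale " + scale + " without rounding");
        }
        if (rescaled.precision() > precision) {
            throw new ArithmeticException("Precision " + rescaled.precision()
                + " exceeds schema precision " + precision);
        }
        byte[] unscaled = rescaled.unscaledValue().toByteArray();
        if (unscaled.length > size) {
            throw new ArithmeticException("Unscaled value needs " + unscaled.length
                + " bytes but the fixed size is only " + size);
        }
        byte[] out = new byte[size];
        // Sign-extend into the leading bytes (0x00 for positive, 0xFF for negative).
        Arrays.fill(out, 0, size - unscaled.length, (byte) (rescaled.signum() < 0 ? 0xFF : 0x00));
        System.arraycopy(unscaled, 0, out, size - unscaled.length, unscaled.length);
        return out;
    }
}
```

The Bytes case would be the same minus the fixed-size check, since the output byte array can grow to fit the unscaled value.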



--
This message was sent by Atlassian Jira
(v8.3.4#803005)