Posted to jira@arrow.apache.org by "Florian Müller (Jira)" <ji...@apache.org> on 2020/11/24 16:52:00 UTC

[jira] [Commented] (ARROW-10674) [Rust] Add integration tests for Decimal type

    [ https://issues.apache.org/jira/browse/ARROW-10674?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17238253#comment-17238253 ] 

Florian Müller commented on ARROW-10674:
----------------------------------------

Hey [~nevi_me],


I started working on the IPC reader for decimals, but I ran into an interesting issue; maybe you can help me with it:

As far as I can tell, the values in `arrow/testing/data/arrow-ipc-stream/integration/0.14.1/generated_decimal.arrow_file` are encoded as little endian. I was under the impression that big endian is used, based on the documentation in the [parquet crate|https://docs.rs/parquet/2.0.0/parquet/basic/enum.LogicalType.html#variant.DECIMAL]. Does Arrow serialization always use a different byte order than Parquet, or am I missing something?
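
To make sure we're talking about the same thing, here is a minimal sketch of the two interpretations I have in mind. This is just the standard library's from_le_bytes/from_be_bytes, not the actual reader code, and it assumes the value is stored as a 16-byte two's-complement integer (Decimal128); the bytes are made up for illustration:

{code:rust}
fn main() {
    // Hypothetical raw 16-byte buffer for the unscaled value -11697.
    let le_bytes: [u8; 16] = (-11697i128).to_le_bytes();

    // If the Arrow IPC buffer is little endian (as it looks in the test file),
    // I'd decode it like this:
    let arrow_value = i128::from_le_bytes(le_bytes);

    // Parquet's DECIMAL (fixed-length byte array) is big endian, so the same
    // logical value would be decoded like this instead:
    let be_bytes: [u8; 16] = (-11697i128).to_be_bytes();
    let parquet_value = i128::from_be_bytes(be_bytes);

    assert_eq!(arrow_value, -11697);
    assert_eq!(parquet_value, -11697);
}
{code}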


In addition, the type is Decimal(3, 2), but the values are e.g. -11697. My understanding so far (again, this may only be true for Parquet) is that such a value does not fit in Decimal(3, 2); if it actually represents -116.97, the type should be Decimal(5, 2).
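
Just to spell out how I'm applying precision and scale here (this is my assumption, not something I've verified against the Arrow spec yet):

{code:rust}
// Sketch of my interpretation: the stored integer is the unscaled value and
// the scale gives the number of fractional digits.
fn main() {
    let raw: i128 = -11697;
    let scale: u32 = 2;

    let divisor = 10i128.pow(scale);
    let int_part = raw / divisor;          // -116
    let frac_part = (raw % divisor).abs(); // 97

    // Prints "-116.97": five significant digits, which is why I'd expect
    // Decimal(5, 2) rather than Decimal(3, 2).
    println!("{}.{:0width$}", int_part, frac_part, width = scale as usize);
}
{code}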


I am a bit out of my depth here, so if you have any documentation on which encoding applies where, that would be very helpful.

> [Rust] Add integration tests for Decimal type
> ---------------------------------------------
>
>                 Key: ARROW-10674
>                 URL: https://issues.apache.org/jira/browse/ARROW-10674
>             Project: Apache Arrow
>          Issue Type: Sub-task
>          Components: Rust
>            Reporter: Neville Dipale
>            Priority: Major
>
> We have basic decimal support, but we have not yet included decimals in the integration testing.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)