Posted to github@arrow.apache.org by GitBox <gi...@apache.org> on 2020/12/03 17:18:17 UTC

[GitHub] [arrow] sweb commented on a change in pull request #8784: ARROW-10674: [Rust] Fix BigDecimal to be little endian; Add IPC Reader/Writer for Decimal type to allow integration tests

sweb commented on a change in pull request #8784:
URL: https://github.com/apache/arrow/pull/8784#discussion_r535429404



##########
File path: rust/arrow/src/ipc/convert.rs
##########
@@ -97,6 +97,12 @@ pub fn fb_to_schema(fb: ipc::Schema) -> Schema {
     let len = c_fields.len();
     for i in 0..len {
         let c_field: ipc::Field = c_fields.get(i);
+        match c_field.type_type() {
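
The hunk is truncated at the opening `match`. For orientation, here is a hedged sketch of one plausible shape for the new arm, mapping the flatbuffer `Decimal` type onto `DataType::Decimal`; the accessor names follow the generated flatbuffers API, and the PR's actual code may differ:

```rust
match c_field.type_type() {
    ipc::Type::Decimal => {
        // Illustration only: read precision/scale from the flatbuffer
        // Decimal metadata and build the corresponding Arrow data type.
        let decimal = c_field.type_as_decimal().unwrap();
        let _data_type = DataType::Decimal(
            decimal.precision() as usize,
            decimal.scale() as usize,
        );
    }
    _ => { /* other types handled as before */ }
}
```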

Review comment:
       @alamb thank you for being so nice about it - I was just too lazy to add a test and should receive full scrutiny ;)
   
    This is partly due to the fact that I am not very familiar with flatbuffers and still do not fully understand how to construct the appropriate flatbuffer to test this. As a temporary solution, I have added two tests to `ipc::reader` that use the big-endian files in `arrow-ipc-stream/integration/1.0.0-bigendian`; a sketch of such a test follows. The one for decimal fails; the others pass. I hope this is okay for now, until I am able to construct the correct schema message to test this directly in `ipc::convert`.
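    A minimal sketch of the kind of reader test I mean, assuming it lives in the test module of `rust/arrow/src/ipc/reader.rs` (so `FileReader` is in scope); the file name `generated_decimal.arrow_file` and the `ARROW_TEST_DATA` variable are modeled on the existing reader tests rather than taken verbatim from this PR:

```rust
#[test]
fn read_generated_bigendian_decimal_file() {
    // Assumed to point at a checkout of the arrow-testing data.
    let testdata =
        std::env::var("ARROW_TEST_DATA").expect("ARROW_TEST_DATA not defined");
    let file = std::fs::File::open(format!(
        "{}/arrow-ipc-stream/integration/1.0.0-bigendian/generated_decimal.arrow_file",
        testdata
    ))
    .unwrap();

    let mut reader = FileReader::try_new(file).unwrap();

    // Draining the reader exercises schema conversion in `ipc::convert`
    // and the decimal byte-order handling in one go.
    while let Some(batch) = reader.next() {
        batch.unwrap();
    }
}
```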
   
    While adding the big-endian tests for the other types, I noticed that the file contents do not match the JSON content, which is why the test does not include an equality check. There may therefore be big-endian problems with other types as well.
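    Even without a content equality check, the tests could still assert something. A small sketch, given a `FileReader` like the one above: require that every batch deserializes and agrees with the schema announced in the file.

```rust
// Sketch: no cell-by-cell comparison against the JSON files, but we can
// still require that every batch deserializes successfully and carries
// the same schema as the file itself.
let schema = reader.schema();
for batch in reader {
    let batch = batch.unwrap();
    assert_eq!(batch.schema(), schema);
}
```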




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org