Posted to jira@arrow.apache.org by "Antoine Pitrou (Jira)" <ji...@apache.org> on 2021/01/05 18:44:03 UTC

[jira] [Commented] (ARROW-10406) [C++] Support dictionary replacement when writing IPC files

    [ https://issues.apache.org/jira/browse/ARROW-10406?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17259128#comment-17259128 ] 

Antoine Pitrou commented on ARROW-10406:
----------------------------------------

I think this was done in ARROW-6883 (including unit tests). [~npr] Can you confirm this is ok for you?

> [C++] Support dictionary replacement when writing IPC files
> -----------------------------------------------------------
>
>                 Key: ARROW-10406
>                 URL: https://issues.apache.org/jira/browse/ARROW-10406
>             Project: Apache Arrow
>          Issue Type: New Feature
>          Components: C++
>            Reporter: Neal Richardson
>            Priority: Major
>             Fix For: 3.0.0
>
>
> I read a big (taxi) csv file and specified that I wanted to dictionary-encode some columns. The resulting Table has ChunkedArrays with 1604 chunks. When I go to write this Table to the IPC file format (write_feather), I get an error: 
> {code}
>   Invalid: Dictionary replacement detected when writing IPC file format. Arrow IPC files only support a single dictionary for a given field across all batches.
> {code}
> I can write this to Parquet and read it back in, and the roundtrip of the data is correct. We should be able to do this in IPC too.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)