Posted to jira@arrow.apache.org by "Ben Kietzman (Jira)" <ji...@apache.org> on 2021/02/09 17:51:00 UTC
[jira] [Resolved] (ARROW-10406) [C++] Unify dictionaries when writing IPC file in a single shot
[ https://issues.apache.org/jira/browse/ARROW-10406?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Ben Kietzman resolved ARROW-10406.
----------------------------------
Resolution: Fixed
Issue resolved by pull request 9348
[https://github.com/apache/arrow/pull/9348]
> [C++] Unify dictionaries when writing IPC file in a single shot
> ---------------------------------------------------------------
>
> Key: ARROW-10406
> URL: https://issues.apache.org/jira/browse/ARROW-10406
> Project: Apache Arrow
> Issue Type: Wish
> Components: C++
> Reporter: Neal Richardson
> Assignee: Antoine Pitrou
> Priority: Major
> Labels: pull-request-available
> Fix For: 4.0.0
>
> Time Spent: 2h 40m
> Remaining Estimate: 0h
>
> I read a big (taxi) csv file and specified that I wanted to dictionary-encode some columns. The resulting Table has ChunkedArrays with 1604 chunks. When I go to write this Table to the IPC file format (write_feather), I get an error:
> {code}
> Invalid: Dictionary replacement detected when writing IPC file format. Arrow IPC files only support a single dictionary for a given field across all batches.
> {code}
> I can write this to Parquet and read it back in, and the roundtrip of the data is correct. We should be able to do this in IPC too.
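The fix resolves this by unifying the per-chunk dictionaries into one shared dictionary before writing the IPC file. A minimal sketch of that idea in plain Python (illustrative only; function and variable names are my own, not Arrow's actual C++ API):

```python
# Sketch of dictionary unification across chunks (illustrative only;
# not Arrow's actual implementation).
def unify_dictionaries(chunks):
    """Each chunk is (dictionary, indices). Return one shared dictionary
    plus remapped indices so every chunk references that dictionary."""
    unified = []            # the single combined dictionary
    position = {}           # value -> index in the unified dictionary
    remapped_chunks = []
    for dictionary, indices in chunks:
        transpose = []      # old index -> new index for this chunk
        for value in dictionary:
            if value not in position:
                position[value] = len(unified)
                unified.append(value)
            transpose.append(position[value])
        remapped_chunks.append([transpose[i] for i in indices])
    return unified, remapped_chunks

# Two chunks whose dictionaries differ for the same field:
chunks = [(["red", "green"], [0, 1, 0]),
          (["green", "blue"], [0, 1])]
dictionary, remapped = unify_dictionaries(chunks)
# dictionary == ["red", "green", "blue"]
# remapped == [[0, 1, 0], [1, 2]]
```

With all chunks remapped onto the single unified dictionary, the writer can emit one dictionary message per field, which is exactly what the IPC file format requires.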
--
This message was sent by Atlassian Jira
(v8.3.4#803005)