Posted to issues@arrow.apache.org by "Antoine Pitrou (Jira)" <ji...@apache.org> on 2019/11/07 11:24:00 UTC

[jira] [Commented] (ARROW-3410) [C++][Dataset] Streaming CSV reader interface for memory-constrained environments

    [ https://issues.apache.org/jira/browse/ARROW-3410?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16969181#comment-16969181 ] 

Antoine Pitrou commented on ARROW-3410:
---------------------------------------

[~fsaintjacques] What kind of API would Datasets need from a streaming CSV reader? A RecordBatch iterator? Something else?
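
For illustration, a minimal sketch of what a RecordBatch-iterator-style interface could look like, modeled on the existing arrow::RecordBatchReader base class; the StreamingCsvReader name below is hypothetical and not an existing Arrow API:

    // Hypothetical streaming CSV reader exposing the RecordBatchReader interface.
    #include <memory>
    #include "arrow/record_batch.h"
    #include "arrow/status.h"

    class StreamingCsvReader : public arrow::RecordBatchReader {
     public:
      // Schema inferred (or supplied) up front, before any batch is produced.
      std::shared_ptr<arrow::Schema> schema() const override { return schema_; }

      // Parse the next block of CSV rows and return it as a RecordBatch;
      // sets *batch to nullptr once the input is exhausted.
      arrow::Status ReadNext(std::shared_ptr<arrow::RecordBatch>* batch) override;

     private:
      std::shared_ptr<arrow::Schema> schema_;
    };
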

> [C++][Dataset] Streaming CSV reader interface for memory-constrained environments
> ----------------------------------------------------------------------------------
>
>                 Key: ARROW-3410
>                 URL: https://issues.apache.org/jira/browse/ARROW-3410
>             Project: Apache Arrow
>          Issue Type: New Feature
>          Components: C++, C++ - Dataset
>            Reporter: Wes McKinney
>            Priority: Major
>              Labels: dataset
>
> CSV reads are currently all-or-nothing. If the results of parsing a CSV file do not fit into memory, this can be a problem. I propose to define a streaming {{RecordBatchReader}} interface so that the record batches produced by reading can be written out immediately to a stream on disk, to be memory-mapped later.
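
A rough sketch of the use case described in the quoted issue, assuming a reader with the RecordBatchReader interface; the DumpCsvToIpc helper is hypothetical, and the IPC writer calls follow Arrow C++'s current API, which is shown for illustration only and may differ from what was available at the time:

    // Drain a streaming reader batch by batch, spilling each batch to an IPC
    // stream file on disk so the result can be memory-mapped later.
    #include <memory>
    #include <string>
    #include "arrow/io/file.h"
    #include "arrow/ipc/writer.h"
    #include "arrow/record_batch.h"
    #include "arrow/result.h"
    #include "arrow/status.h"

    arrow::Status DumpCsvToIpc(arrow::RecordBatchReader* reader,
                               const std::string& path) {
      ARROW_ASSIGN_OR_RAISE(auto sink, arrow::io::FileOutputStream::Open(path));
      ARROW_ASSIGN_OR_RAISE(
          auto writer, arrow::ipc::MakeStreamWriter(sink, reader->schema()));
      std::shared_ptr<arrow::RecordBatch> batch;
      while (true) {
        ARROW_RETURN_NOT_OK(reader->ReadNext(&batch));
        if (batch == nullptr) break;  // end of CSV input
        ARROW_RETURN_NOT_OK(writer->WriteRecordBatch(*batch));  // spill to disk
      }
      return writer->Close();  // file can be memory-mapped and read back later
    }

Only one parsed batch needs to be resident at a time, which is what keeps peak memory bounded regardless of the CSV file's size.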



--
This message was sent by Atlassian Jira
(v8.3.4#803005)