Posted to issues@arrow.apache.org by "Antoine Pitrou (JIRA)" <ji...@apache.org> on 2018/08/27 17:12:00 UTC

[jira] [Comment Edited] (ARROW-501) [C++] Implement concurrent / buffering InputStream for streaming data use cases

    [ https://issues.apache.org/jira/browse/ARROW-501?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16593945#comment-16593945 ] 

Antoine Pitrou edited comment on ARROW-501 at 8/27/18 5:11 PM:
---------------------------------------------------------------

If we want to do this, I think a simple background thread is the right approach. Note, though, that it will probably incur additional memory copies unless we add a method allowing partial / best-effort reads.

Edit: the comment about memory copies only applies if we want a Stream interface. If we define the chunk size upfront, and/or let the readahead wrapper choose the chunk size dynamically, then it's not a problem.
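
For illustration only, here is a minimal sketch of the background-thread readahead idea in plain standard C++. All names below (ReadaheadStream, NextChunk, the raw_read callback) are hypothetical and not part of Arrow's API; error handling and cancellation of a blocking underlying read are omitted. A worker thread keeps filling a bounded queue of chunks, and the consumer pops whole chunks off the queue.

    #include <condition_variable>
    #include <cstdint>
    #include <deque>
    #include <functional>
    #include <mutex>
    #include <thread>
    #include <utility>
    #include <vector>

    // Hypothetical readahead wrapper: a worker thread reads fixed-size chunks
    // from an underlying source into a bounded queue ahead of the consumer.
    class ReadaheadStream {
     public:
      // `raw_read` reads up to `nbytes` from the underlying source into `out`
      // and returns the number of bytes actually read (0 or less on EOF).
      ReadaheadStream(std::function<int64_t(uint8_t*, int64_t)> raw_read,
                      int64_t chunk_size, size_t max_chunks)
          : raw_read_(std::move(raw_read)),
            chunk_size_(chunk_size),
            max_chunks_(max_chunks),
            worker_([this] { this->Worker(); }) {}

      ~ReadaheadStream() {
        {
          std::lock_guard<std::mutex> lock(mutex_);
          stopped_ = true;
        }
        not_full_.notify_all();
        worker_.join();
      }

      // Returns the next buffered chunk (empty vector once EOF is reached and
      // the queue is drained).  Handing out whole chunks avoids the extra copy
      // that a byte-oriented Read(nbytes, out) would need.
      std::vector<uint8_t> NextChunk() {
        std::unique_lock<std::mutex> lock(mutex_);
        not_empty_.wait(lock, [this] { return !chunks_.empty() || eof_; });
        if (chunks_.empty()) return {};
        std::vector<uint8_t> chunk = std::move(chunks_.front());
        chunks_.pop_front();
        not_full_.notify_one();
        return chunk;
      }

     private:
      void Worker() {
        for (;;) {
          // Read outside the lock so the consumer is never blocked by I/O.
          std::vector<uint8_t> chunk(static_cast<size_t>(chunk_size_));
          int64_t n = raw_read_(chunk.data(), chunk_size_);
          std::unique_lock<std::mutex> lock(mutex_);
          if (n <= 0) {
            eof_ = true;
            not_empty_.notify_all();
            return;
          }
          chunk.resize(static_cast<size_t>(n));
          // Block if the bounded queue is full (backpressure).
          not_full_.wait(lock, [this] {
            return chunks_.size() < max_chunks_ || stopped_;
          });
          if (stopped_) return;
          chunks_.push_back(std::move(chunk));
          not_empty_.notify_one();
        }
      }

      std::function<int64_t(uint8_t*, int64_t)> raw_read_;
      int64_t chunk_size_;
      size_t max_chunks_;
      std::mutex mutex_;
      std::condition_variable not_empty_, not_full_;
      std::deque<std::vector<uint8_t>> chunks_;
      bool eof_ = false;
      bool stopped_ = false;
      std::thread worker_;  // started last, after all other members exist
    };

Handing out whole chunks, rather than copying arbitrary byte ranges out of the queue to satisfy a Stream-style Read(nbytes), is what avoids the extra copy mentioned above.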


was (Author: pitrou):
If we want to do this, I think a simple background thread is the right approach. Note, though, that it will probably incur additional memory copies unless we add a method allowing partial / best-effort reads.

 

> [C++] Implement concurrent / buffering InputStream for streaming data use cases
> -------------------------------------------------------------------------------
>
>                 Key: ARROW-501
>                 URL: https://issues.apache.org/jira/browse/ARROW-501
>             Project: Apache Arrow
>          Issue Type: New Feature
>          Components: C++
>            Reporter: Wes McKinney
>            Priority: Major
>              Labels: csv
>             Fix For: 0.13.0
>
>
> Related to ARROW-500, when processing an input data stream, we may wish to continue buffering input (up to a maximum buffer size) in between synchronous Read calls.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)