Posted to jira@arrow.apache.org by "Neal Richardson (Jira)" <ji...@apache.org> on 2020/06/16 16:39:00 UTC

[jira] [Commented] (ARROW-3308) [R] Convert R character vector with data exceeding 2GB to chunked array

    [ https://issues.apache.org/jira/browse/ARROW-3308?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17136801#comment-17136801 ] 

Neal Richardson commented on ARROW-3308:
----------------------------------------

What criteria do we use to determine that the data exceeds 2GB in Arrow (without trying first, catching the error, and retrying with chunks)? Or should we always use some chunked builder and then concatenate the resulting chunks if possible?
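One plausible criterion, sketched below as an illustration rather than the Arrow implementation: Arrow's StringArray stores value offsets as signed 32-bit integers, so the total character data in a single array cannot exceed 2^31 - 1 bytes. A builder could track the running UTF-8 byte count and start a new chunk whenever appending the next string would cross that limit. The function name and structure here are hypothetical:

```python
INT32_MAX = 2**31 - 1  # offset limit for Arrow's 32-bit StringArray

def chunk_strings(values, limit=INT32_MAX):
    """Split strings into chunks whose cumulative UTF-8 byte
    size stays at or under `limit` (hypothetical sketch, not
    the actual Arrow builder logic)."""
    chunks, current, size = [], [], 0
    for v in values:
        n = len(v.encode("utf-8"))
        # Start a new chunk if this string would overflow the offsets.
        if current and size + n > limit:
            chunks.append(current)
            current, size = [], 0
        current.append(v)
        size += n
    if current:
        chunks.append(current)
    return chunks

# Tiny demonstration with an artificially small limit:
print(chunk_strings(["aa", "bb", "cc"], limit=4))
```

This avoids the try/catch-and-retry approach by computing sizes up front, at the cost of one extra pass over the data; the alternative raised in the comment (always build chunked, then concatenate when everything fits in one array) trades that pass for a possible concatenation step.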

> [R] Convert R character vector with data exceeding 2GB to chunked array
> -----------------------------------------------------------------------
>
>                 Key: ARROW-3308
>                 URL: https://issues.apache.org/jira/browse/ARROW-3308
>             Project: Apache Arrow
>          Issue Type: New Feature
>          Components: R
>            Reporter: Wes McKinney
>            Assignee: Francois Saint-Jacques
>            Priority: Major
>             Fix For: 1.0.0
>
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)