Posted to github@arrow.apache.org by "mapleFU (via GitHub)" <gi...@apache.org> on 2023/06/09 07:48:39 UTC

[GitHub] [arrow] mapleFU commented on issue #36001: [Ruby][Question] Is it possible to limit memory (RAM) usage when reading very large Parquet files?

mapleFU commented on issue #36001:
URL: https://github.com/apache/arrow/issues/36001#issuecomment-1584127785

   Personally, I suggest trying buffered row-group reading. With unbuffered I/O, all of the column chunks in a row group may be read at once. With buffered I/O, each column reads only `min(buffer-size, page-size)` bytes at a time.
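   A minimal sketch of the buffered-read setting via pyarrow (an assumption on my part: the question is about the Ruby bindings, but both wrap the same C++ Parquet reader, and pyarrow's `ParquetFile` exposes the buffer size directly as `buffer_size`). Combining it with batch iteration keeps peak memory bounded while scanning a large file:

   ```python
   # Sketch, assuming pyarrow is installed. The tiny generated file stands in
   # for a real multi-GB Parquet file.
   import os
   import tempfile

   import pyarrow as pa
   import pyarrow.parquet as pq

   # Build a small sample file with several row groups.
   table = pa.table({"x": list(range(100_000))})
   path = os.path.join(tempfile.mkdtemp(), "example.parquet")
   pq.write_table(table, path, row_group_size=10_000)

   # buffer_size > 0 enables buffered stream reads of each column chunk,
   # instead of pulling a row group's column chunks into memory unbuffered.
   pf = pq.ParquetFile(path, buffer_size=64 * 1024)

   total = 0
   for batch in pf.iter_batches(batch_size=5_000):
       # Process each batch incrementally rather than materializing all rows.
       total += batch.num_rows
   print(total)  # 100000
   ```

   The batch size and the 64 KiB buffer are illustrative values; tuning them trades throughput against peak memory.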


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscribe@arrow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org