Posted to github@arrow.apache.org by "minyoung (via GitHub)" <gi...@apache.org> on 2023/02/01 01:19:13 UTC

[GitHub] [arrow] minyoung commented on issue #33875: [Go] How to handle large lists?

minyoung commented on issue #33875:
URL: https://github.com/apache/arrow/issues/33875#issuecomment-1411308030

   @zeroshade I had a go at implementing support for LargeString/LargeBinary: https://github.com/apache/arrow/pull/33965
   
   A potential gotcha I stumbled on, though: while we can now write out `array.LargeString`, on reads we get `array.String` back. If you're only reading a subset of the dataset at a time (e.g. using the [RecordReader](https://pkg.go.dev/github.com/apache/arrow/go/v10@v10.0.1/parquet/pqarrow#FileReader.GetRecordReader)), things are fine. If you try to read the entire dataset (or column) at once, though, you hit the original error from this issue again.
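   
   To make that concrete, here's a minimal sketch (mine, not code from the PR) of the two read paths described above, using the pqarrow APIs linked there. The file name, column selection, and batch size are placeholders, and the import paths assume the v11 module layout; adjust to your version.
   
   ```go
   package main
   
   import (
       "context"
       "os"
   
       "github.com/apache/arrow/go/v11/arrow/memory"
       "github.com/apache/arrow/go/v11/parquet/file"
       "github.com/apache/arrow/go/v11/parquet/pqarrow"
   )
   
   func main() {
       f, err := os.Open("large_strings.parquet") // placeholder path
       if err != nil {
           panic(err)
       }
       defer f.Close()
   
       pf, err := file.NewParquetReader(f)
       if err != nil {
           panic(err)
       }
       defer pf.Close()
   
       rdr, err := pqarrow.NewFileReader(pf, pqarrow.ArrowReadProperties{BatchSize: 64 * 1024}, memory.DefaultAllocator)
       if err != nil {
           panic(err)
       }
       ctx := context.Background()
   
       // Path 1: stream record batches. Each batch's string column is small
       // enough that getting array.String back instead of array.LargeString
       // doesn't hurt.
       rr, err := rdr.GetRecordReader(ctx, nil, nil) // nil, nil = all columns, all row groups
       if err != nil {
           panic(err)
       }
       defer rr.Release()
       for rr.Next() {
           rec := rr.Record()
           _ = rec // process the batch
       }
   
       // Path 2: materialize everything at once. Per the comment above, this
       // is where the original offset-overflow error from this issue comes back.
       if _, err := rdr.ReadTable(ctx); err != nil {
           panic(err)
       }
   }
   ```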
   
   Presumably writing with `pqarrow.NewArrowWriterProperties(pqarrow.WithStoreSchema())` is supposed to fix this, but it just hasn't been implemented yet?
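   
   For reference, here's a rough sketch (again mine, assuming the LargeString support from the PR above) of what writing with `WithStoreSchema` looks like; it writes the same placeholder file the read sketch above opens. `WithStoreSchema` embeds the Arrow schema in the Parquet key/value metadata, so a reader could in principle reconstruct `array.LargeString` from it. The field name and output path are made up.
   
   ```go
   package main
   
   import (
       "os"
   
       "github.com/apache/arrow/go/v11/arrow"
       "github.com/apache/arrow/go/v11/arrow/array"
       "github.com/apache/arrow/go/v11/arrow/memory"
       "github.com/apache/arrow/go/v11/parquet"
       "github.com/apache/arrow/go/v11/parquet/pqarrow"
   )
   
   func main() {
       mem := memory.DefaultAllocator
       schema := arrow.NewSchema([]arrow.Field{
           {Name: "val", Type: arrow.BinaryTypes.LargeString}, // hypothetical column
       }, nil)
   
       bldr := array.NewRecordBuilder(mem, schema)
       defer bldr.Release()
       bldr.Field(0).(*array.LargeStringBuilder).AppendValues([]string{"foo", "bar"}, nil)
       rec := bldr.NewRecord()
       defer rec.Release()
   
       out, err := os.Create("large_strings.parquet") // placeholder path
       if err != nil {
           panic(err)
       }
       defer out.Close()
   
       // WithStoreSchema stores the serialized Arrow schema in the Parquet
       // metadata; the open question above is whether the read path uses it
       // to hand back array.LargeString instead of array.String.
       arrProps := pqarrow.NewArrowWriterProperties(pqarrow.WithStoreSchema())
       w, err := pqarrow.NewFileWriter(schema, out, parquet.NewWriterProperties(), arrProps)
       if err != nil {
           panic(err)
       }
       if err := w.Write(rec); err != nil {
           panic(err)
       }
       if err := w.Close(); err != nil {
           panic(err)
       }
   }
   ```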
   
   Aside: I found it humorous that the C++ side of things also [errors out](https://github.com/apache/arrow/issues/21526#issuecomment-1377761216).


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscribe@arrow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org