Posted to jira@arrow.apache.org by "Ishan (Jira)" <ji...@apache.org> on 2020/09/10 07:14:00 UTC
[jira] [Updated] (ARROW-9958) Error writing record batches to IPC streaming format
[ https://issues.apache.org/jira/browse/ARROW-9958?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Ishan updated ARROW-9958:
-------------------------
Description:
Writing record batches to the Arrow IPC streaming format with on-the-fly compression intermittently fails with one of two errors.
Attached (example1.py, example2.py) is the code producing each of the errors below. I cannot reproduce the failures with smaller batch sizes, so the problem appears related to the size of each record batch. It also does not seem specific to pyarrow, since I see a similar failure through the C GLib API.
# Error case 1
```
~/py376/lib/python3.7/site-packages/pyarrow/ipc.pxi in pyarrow.lib._CRecordBatchReader.read_next_batch()
~/py376/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.check_status()
OSError: Truncated compressed stream
```
# Error case 2
```
~/py376/lib/python3.7/site-packages/pyarrow/ipc.pxi in pyarrow.lib._RecordBatchStreamReader._open()
~/py376/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.pyarrow_internal_check_status()
~/py376/lib/python3.7/site-packages/pyarrow/error.pxi in pyarrow.lib.check_status()
ArrowInvalid: Tried reading schema message, was null or length 0
```
> Error writing record batches to IPC streaming format
> ----------------------------------------------------
>
> Key: ARROW-9958
> URL: https://issues.apache.org/jira/browse/ARROW-9958
> Project: Apache Arrow
> Issue Type: Bug
> Components: GLib, Python
> Affects Versions: 1.0.1
> Environment: pyarrow - Version: 1.0.1
> python - version 3.7.6
> Operating system - CentOS Linux release 7.8.2003 (Core)
> Reporter: Ishan
> Priority: Major
> Labels: Compression
> Attachments: example1.py, example2.py
>
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)