Posted to issues@arrow.apache.org by "Wes McKinney (JIRA)" <ji...@apache.org> on 2017/05/24 21:14:05 UTC

[jira] [Commented] (ARROW-1067) Write to parquet with InMemoryOutputStream

    [ https://issues.apache.org/jira/browse/ARROW-1067?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16023697#comment-16023697 ] 

Wes McKinney commented on ARROW-1067:
-------------------------------------

Can you provide a reproducible example that fails? There isn't enough information here for me to reproduce the problem.

{{pq.write_table(table, f)}}

should work with any Python file-like object, so you can pass the ADL file object directly. If it does not work, let me know. 
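
For example, something like this should work (a minimal sketch based on your snippet; {{adl}}, {{df}}, and {{my_file_path}} are assumed to be an azure-datalake-store client, a pandas DataFrame, and a target path, respectively):

{code}
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.Table.from_pandas(df, timestamps_to_ms=True)

# Pass the ADL file handle directly to write_table;
# no intermediate InMemoryOutputStream is needed.
with adl.open(my_file_path, 'wb') as f:
    pq.write_table(table, f)
{code}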

> Write to parquet with InMemoryOutputStream
> ------------------------------------------
>
>                 Key: ARROW-1067
>                 URL: https://issues.apache.org/jira/browse/ARROW-1067
>             Project: Apache Arrow
>          Issue Type: Bug
>          Components: Python
>    Affects Versions: 0.4.0
>         Environment: Debian 8.5, Anaconda Python 3.6, pyarrow 0.4.0a0
>            Reporter: Chase Slater
>            Priority: Minor
>              Labels: documentation
>
> When I run the following (from the docs), Python crashes during the pq.write_table statement. How would I go about writing a Parquet file to a file buffer (e.g. for use with Azure Data Lake)?
> {code}
> import pyarrow as pa
> import pyarrow.parquet as pq
> table = pa.Table.from_pandas(df, timestamps_to_ms=True)
> with adl.open(my_file_path, 'wb') as f:
>     output = pa.InMemoryOutputStream()
>     pq.write_table(table, output) # crashes here
>     f.write(output.get_result().to_pybytes())
> {code}


