Posted to issues@arrow.apache.org by "Antoine Pitrou (JIRA)" <ji...@apache.org> on 2019/07/17 15:16:00 UTC

[jira] [Commented] (ARROW-5966) [Python] Capacity error when converting large string numpy array to arrow array

    [ https://issues.apache.org/jira/browse/ARROW-5966?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16887155#comment-16887155 ] 

Antoine Pitrou commented on ARROW-5966:
---------------------------------------

I am not seeing this issue:

{code:python}
>>> import pyarrow as pa
>>> import numpy as np
>>>
>>> l = []
>>> x = b"x" * 1024
>>> for i in range(4 * (1024**2)): l.append(x)
>>> arr = pa.array(l)
>>> arr.type
DataType(binary)
>>> type(arr)
pyarrow.lib.ChunkedArray
>>> len(arr)
4194304
>>> len(arr.chunks)
3
>>> del arr
>>> narr = np.array(l)
>>> narr.nbytes
4294967296
>>> arr = pa.array(narr)
>>> type(arr)
pyarrow.lib.ChunkedArray
>>> len(arr.chunks)
256
>>> len(arr)
4194304
{code}
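
For comparison, a minimal sketch of the unicode (str) variant, which is closer to the reporter's repro; this is untested here, and note that numpy stores str values as fixed-width '<U' data (4 bytes per character, so roughly 16 GiB for the array below), not '|S' bytes as above:

{code:python}
import numpy as np
import pyarrow as pa

# Hypothetical unicode variant of the test above: numpy stores str values
# as a fixed-width '<U1024' array (4 bytes per character, ~16 GiB here),
# not as '|S1024' like the bytes version.
l = ["x" * 1024] * (4 * 1024**2)
narr = np.array(l)       # dtype('<U1024')
arr = pa.array(narr)     # the reporter sees ArrowCapacityError here on 0.13/0.14
{code}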


> [Python] Capacity error when converting large string numpy array to arrow array
> -------------------------------------------------------------------------------
>
>                 Key: ARROW-5966
>                 URL: https://issues.apache.org/jira/browse/ARROW-5966
>             Project: Apache Arrow
>          Issue Type: Bug
>          Components: Python
>    Affects Versions: 0.13.0, 0.14.0
>            Reporter: Igor Yastrebov
>            Priority: Major
>
> Trying to create a large string array fails with 
> ArrowCapacityError: Encoded string length exceeds maximum size (2GB)
> instead of creating a chunked array.
>  
> A reproducible example:
> {code:python}
> import uuid
> import numpy as np
> import pyarrow as pa
> li = []
> for i in range(100000000):
>     li.append(uuid.uuid4().hex)
> arr = np.array(li)
> parr = pa.array(arr)
> {code}
> Is this a regression, or was [https://github.com/apache/arrow/issues/1855] never properly fixed?
>  
>  
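
If the numpy conversion path is indeed the problem, a possible workaround sketch (assuming the plain-list path, which chunks correctly as shown above, avoids the error) is to go through a Python list:

{code:python}
import uuid
import numpy as np
import pyarrow as pa

# Workaround sketch (assumes the list path avoids the capacity error, as
# the ChunkedArray output above suggests): convert the numpy array back
# to a Python list before handing it to pyarrow.
li = [uuid.uuid4().hex for _ in range(1000)]  # small scale for illustration
arr = np.array(li)                            # dtype('<U32')
parr = pa.array(arr.tolist())                 # small input -> StringArray;
                                              # the list path chunks past 2GB
{code}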


