Posted to issues@spark.apache.org by "ASF GitHub Bot (JIRA)" <ji...@apache.org> on 2018/12/13 04:14:00 UTC

[jira] [Commented] (SPARK-26355) Add a workaround for PyArrow 0.11.

    [ https://issues.apache.org/jira/browse/SPARK-26355?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16719733#comment-16719733 ] 

ASF GitHub Bot commented on SPARK-26355:
----------------------------------------

ueshin opened a new pull request #23305: [SPARK-26355][PYSPARK] Add a workaround for PyArrow 0.11.
URL: https://github.com/apache/spark/pull/23305
 
 
   ## What changes were proposed in this pull request?
   
   In PyArrow 0.11, there is an API-breaking change.
   
   - [ARROW-1949](https://issues.apache.org/jira/browse/ARROW-1949) - [Python/C++] Add option to Array.from_pandas and pyarrow.array to perform unsafe casts.
   
   We should add a workaround to support PyArrow 0.11.
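   
   A minimal sketch of the kind of version-gated workaround intended here (the helper name `_create_array` and the use of `safe=False` are illustrative assumptions; the actual change in this PR may differ):
   
   ```python
   from distutils.version import LooseVersion
   
   import pandas as pd
   import pyarrow as pa
   
   
   def _create_array(series, arrow_type):
       # Hypothetical helper: convert a pandas Series to a pyarrow Array.
       # ARROW-1949 (PyArrow 0.11) added the `safe` keyword to Array.from_pandas;
       # older PyArrow versions do not accept it, so gate it on the installed version.
       if LooseVersion(pa.__version__) >= LooseVersion("0.11.0"):
           # Assumption: relax the new safety check so casts behave as before.
           return pa.Array.from_pandas(series, type=arrow_type, safe=False)
       else:
           return pa.Array.from_pandas(series, type=arrow_type)
   
   
   # Example: an int64 pandas Series cast to a narrower Arrow type.
   arr = _create_array(pd.Series([1, 2, 3]), pa.int32())
   print(arr.type)  # int32
   ```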
   
   ## How was this patch tested?
   
   Tested manually in my local environment.
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


> Add a workaround for PyArrow 0.11.
> ----------------------------------
>
>                 Key: SPARK-26355
>                 URL: https://issues.apache.org/jira/browse/SPARK-26355
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, SQL
>    Affects Versions: 2.4.0
>            Reporter: Takuya Ueshin
>            Priority: Major
>
> In PyArrow 0.11, there is an API-breaking change.
> - ARROW-1949 - [Python/C++] Add option to Array.from_pandas and pyarrow.array to perform unsafe casts.
> We should add a workaround to support PyArrow 0.11.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org