Posted to issues@spark.apache.org by "Leif Walsh (JIRA)" <ji...@apache.org> on 2018/08/19 22:45:00 UTC
[jira] [Comment Edited] (SPARK-21187) Complete support for remaining Spark data types in Arrow Converters
[ https://issues.apache.org/jira/browse/SPARK-21187?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16585286#comment-16585286 ]
Leif Walsh edited comment on SPARK-21187 at 8/19/18 10:44 PM:
--------------------------------------------------------------
[~bryanc] is there anything I can help elaborate on, or do you just need to decide whether or not to do it? (regarding multi-indexing)
was (Author: leif):
[~bryanc] is there anything I can help elaborate on, or do you just need to decide whether or not to do it?
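For context, one possible reading of the multi-indexing question (a hypothetical sketch in plain pandas, not Spark's actual converter behavior): a nested struct column could be flattened into hierarchical MultiIndex columns when converting to a pandas DataFrame, so sub-fields stay grouped under their parent column name.

```python
import pandas as pd

# Hypothetical: a Spark struct column s with fields a and b,
# flattened into two-level MultiIndex columns on the pandas side.
df = pd.DataFrame({("s", "a"): [1, 2], ("s", "b"): ["x", "y"]})
df.columns = pd.MultiIndex.from_tuples(df.columns)

# All sub-fields of the struct can then be selected together:
sub = df["s"]  # DataFrame with columns "a" and "b"
print(list(sub.columns))
```

Whether toPandas() should produce this shape for struct columns is exactly the open design question in this thread.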
> Complete support for remaining Spark data types in Arrow Converters
> -------------------------------------------------------------------
>
> Key: SPARK-21187
> URL: https://issues.apache.org/jira/browse/SPARK-21187
> Project: Spark
> Issue Type: Umbrella
> Components: PySpark, SQL
> Affects Versions: 2.3.0
> Reporter: Bryan Cutler
> Assignee: Bryan Cutler
> Priority: Major
>
> This is to track adding the remaining type support in Arrow Converters. Currently, only primitive data types are supported.
> Remaining types:
> * -*Date*-
> * -*Timestamp*-
> * *Complex*: Struct, -Array-, Arrays of Date/Timestamps, Map
> * -*Decimal*-
> * -*Binary*-
> Some things to do before closing this out:
> * -Look to upgrading to Arrow 0.7 for better Decimal support (can now write values as BigDecimal)-
> * -Need to add some user docs-
> * -Make sure Python tests are thorough-
> * Check into complex type support mentioned in comments by [~leif], should we support multi-indexing?
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)