Posted to issues@spark.apache.org by "Laurens (Jira)" <ji...@apache.org> on 2021/03/10 15:51:00 UTC

[jira] [Commented] (SPARK-33489) Support null for conversion from and to Arrow type

    [ https://issues.apache.org/jira/browse/SPARK-33489?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17298921#comment-17298921 ] 

Laurens commented on SPARK-33489:
---------------------------------

I just hit this issue with Spark 3.1.1 and Koalas 1.7.0:
{code:java}
TypeError                                 Traceback (most recent call last)
/var/scratch/miniconda3/lib/python3.8/site-packages/pyspark/sql/udf.py in returnType(self)
    100             try:
--> 101                 to_arrow_type(self._returnType_placeholder)
    102             except TypeError:

/var/scratch/miniconda3/lib/python3.8/site-packages/pyspark/sql/pandas/types.py in to_arrow_type(dt)
     75     else:
---> 76         raise TypeError("Unsupported type in conversion to Arrow: " + str(dt))
     77     return arrow_type

TypeError: Unsupported type in conversion to Arrow: NullType
{code}

> Support null for conversion from and to Arrow type
> --------------------------------------------------
>
>                 Key: SPARK-33489
>                 URL: https://issues.apache.org/jira/browse/SPARK-33489
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 3.0.1
>            Reporter: Yuya Kanai
>            Assignee: Takuya Ueshin
>            Priority: Minor
>             Fix For: 3.2.0
>
>
> I got the error below when using from_arrow_type() in pyspark.sql.pandas.types:
> {{Unsupported type in conversion from Arrow: null}}
> I noticed NullType exists under pyspark.sql.types, so it seems possible to convert from pyarrow's null type to PySpark's NullType and vice versa.
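The fix (shipped in 3.2.0) amounts to teaching the two converters about the null type. A minimal pure-Python sketch of the idea, with placeholder names (`to_arrow_type_sketch`, `from_arrow_type_sketch`, and the string type tags are illustrative, not Spark's actual code):

{code:java}
# Illustrative sketch only -- not the actual pyspark.sql.pandas.types code.
# String tags stand in for Spark DataType and pyarrow type objects.

def to_arrow_type_sketch(dt):
    """Map a Spark type tag to an Arrow type tag, including NullType."""
    mapping = {
        "NullType": "null",        # the branch SPARK-33489 adds
        "IntegerType": "int32",
        "StringType": "string",
    }
    if dt not in mapping:
        raise TypeError("Unsupported type in conversion to Arrow: " + dt)
    return mapping[dt]

def from_arrow_type_sketch(at):
    """Inverse mapping, so 'null' round-trips instead of raising."""
    inverse = {"null": "NullType", "int32": "IntegerType", "string": "StringType"}
    if at not in inverse:
        raise TypeError("Unsupported type in conversion from Arrow: " + at)
    return inverse[at]

print(to_arrow_type_sketch("NullType"))    # null
print(from_arrow_type_sketch("null"))      # NullType
{code}

With the NullType entry present, an all-None column no longer falls through to the TypeError seen in the traceback above.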



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org