Posted to issues@spark.apache.org by "Philip Dakin (Jira)" <ji...@apache.org> on 2023/10/18 21:48:00 UTC

[jira] [Comment Edited] (SPARK-44734) Add documentation for type casting rules in Python UDFs/UDTFs

    [ https://issues.apache.org/jira/browse/SPARK-44734?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17776921#comment-17776921 ] 

Philip Dakin edited comment on SPARK-44734 at 10/18/23 9:47 PM:
----------------------------------------------------------------

[~panbingkun] please make sure the changes work well with https://issues.apache.org/jira/browse/SPARK-44733 in [https://github.com/apache/spark/pull/43369]


was (Author: JIRAUSER302581):
[~panbingkun] please make sure the changes work well with https://github.com/apache/spark/pull/43369.

> Add documentation for type casting rules in Python UDFs/UDTFs
> -------------------------------------------------------------
>
>                 Key: SPARK-44734
>                 URL: https://issues.apache.org/jira/browse/SPARK-44734
>             Project: Spark
>          Issue Type: Sub-task
>          Components: PySpark
>    Affects Versions: 4.0.0
>            Reporter: Allison Wang
>            Priority: Major
>
> In addition to the type mappings between Spark data types and Python data types (SPARK-44733), we should document the type casting rules for regular and Arrow-optimized Python UDFs/UDTFs.
> We currently have this table in code:
>  * Arrow: [https://github.com/apache/spark/blob/master/python/pyspark/sql/pandas/functions.py#L311-L329]
>  * Python UDF: [https://github.com/apache/spark/blob/master/python/pyspark/sql/udf.py#L101-L116]
> We should add a proper documentation page for the type casting rules. 
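To give a sense of what the tables linked above encode: for a regular (pickled) Python UDF, the value returned from Python is converted to the declared Spark SQL return type, and values that cannot be represented in that type generally surface as NULL rather than raising an error. The sketch below is only an illustration of that pattern, not Spark's actual implementation; the helper name `cast_udf_result` and the per-type rules shown are simplified assumptions for demonstration.

```python
def cast_udf_result(value, spark_type):
    """Illustrative sketch (NOT Spark's code) of null-on-mismatch casting:
    a Python UDF result that does not fit the declared SQL type is
    replaced with NULL (None) instead of raising an exception."""
    if value is None:
        # NULL passes through for any declared type.
        return None
    if spark_type == "int":
        # Only a Python int maps cleanly to an integer column value
        # in this simplified model; other types become NULL.
        return value if isinstance(value, int) and not isinstance(value, bool) else None
    if spark_type == "string":
        # Only a Python str maps cleanly to a string column value here.
        return value if isinstance(value, str) else None
    # Unmodeled types: treat as NULL in this sketch.
    return None
```

For example, under this model a UDF declared with an integer return type that returns the string "abc" would produce NULL, while returning 7 would produce 7. The real rules differ per type pair (and differ again under Arrow optimization), which is exactly why a dedicated documentation page is needed.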



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org