Posted to issues@spark.apache.org by "Takeshi Yamamuro (Jira)" <ji...@apache.org> on 2020/07/12 00:47:00 UTC

[jira] [Resolved] (SPARK-32154) Use ExpressionEncoder for the return type of ScalaUDF to convert to catalyst type

     [ https://issues.apache.org/jira/browse/SPARK-32154?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Takeshi Yamamuro resolved SPARK-32154.
--------------------------------------
    Fix Version/s: 3.1.0
         Assignee: wuyi
       Resolution: Fixed

Resolved by https://github.com/apache/spark/pull/28979

> Use ExpressionEncoder for the return type of ScalaUDF to convert to catalyst type
> ---------------------------------------------------------------------------------
>
>                 Key: SPARK-32154
>                 URL: https://issues.apache.org/jira/browse/SPARK-32154
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: wuyi
>            Assignee: wuyi
>            Priority: Major
>             Fix For: 3.1.0
>
>
> Users can currently register a UDF with Instant/LocalDate as the return type even with
> spark.sql.datetime.java8API.enabled=false. However, the UDF can only actually be used with spark.sql.datetime.java8API.enabled=true. This can confuse users.
> The problem is that we use ExpressionEncoder to ser/deser types when registering the UDF, but use the Catalyst converters, which are controlled by spark.sql.datetime.java8API.enabled, to ser/deser types when executing the UDF.
> If we could also use ExpressionEncoder to ser/deser the return type, similar to what we do for the input parameter types, then the UDF could support Instant/LocalDate, and even complex types that combine them.
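
Below is a minimal Scala sketch of the registration-time/execution-time mismatch the issue describes, as it behaved before the fix in the PR above. The session setup, the UDF name ("today"), and the exact runtime failure mode are illustrative assumptions; only the config key spark.sql.datetime.java8API.enabled and the ExpressionEncoder-versus-Catalyst-converter split come from the issue itself.

    import java.time.LocalDate
    import org.apache.spark.sql.SparkSession

    object Spark32154Demo {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[*]")
          .appName("SPARK-32154 demo")
          // Java 8 datetime API support explicitly disabled.
          .config("spark.sql.datetime.java8API.enabled", "false")
          .getOrCreate()

        // Registration succeeds even with the flag off: the declared return
        // type (java.time.LocalDate) is resolved via ExpressionEncoder, which
        // handles java.time types regardless of the flag.
        spark.udf.register("today", () => LocalDate.now())

        // Execution is where it breaks before the fix: the returned LocalDate
        // is converted back through the Catalyst converters, which with the
        // flag off expect java.sql.Date for DateType, so the query fails at
        // runtime (the exact error depends on the Spark version).
        spark.sql("SELECT today()").show()

        spark.stop()
      }
    }

With spark.sql.datetime.java8API.enabled=true the same query runs, which is exactly the gap between registration time and execution time that the change closes by routing the return value through ExpressionEncoder as well.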



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org