Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2021/12/07 06:28:00 UTC

[jira] [Resolved] (SPARK-37539) Create Spark interface for creating UDFs

     [ https://issues.apache.org/jira/browse/SPARK-37539?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-37539.
----------------------------------
    Resolution: Duplicate

> Create Spark interface for creating UDFs
> ----------------------------------------
>
>                 Key: SPARK-37539
>                 URL: https://issues.apache.org/jira/browse/SPARK-37539
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.2.0
>            Reporter: Almaz Murzabekov
>            Priority: Major
>
> Currently, based on [spark documentation|https://spark.apache.org/docs/3.2.0/sql-ref-syntax-ddl-create-function.html], if you need to create a custom UDF with specific logic implemented in (let's say) Scala, you have to extend your class from:
>  # UDF or UDAF from the org.apache.hadoop.hive.ql.exec package
>  # AbstractGenericUDAFResolver, GenericUDF, or GenericUDTF from the org.apache.hadoop.hive.ql.udf.generic package
>  # UserDefinedAggregateFunction in org.apache.spark.sql.expressions (which is deprecated)
> The third option is deprecated and, as far as I can tell, applies only to user-defined aggregate functions. So for a plain UDF we have to follow one of the first two options, which means we MUST add a dependency on hive-exec. Maybe we could provide a Spark-native interface for creating UDFs.
> WDYT guys?
> Thanks in advance
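
[Editorial note: for programmatic (non-DDL) use, Spark already ships a native Scala UDF API via org.apache.spark.sql.functions.udf and SparkSession.udf.register, with no hive-exec dependency; it is the SQL CREATE FUNCTION DDL path covered by the linked documentation that requires the Hive base classes. A minimal sketch, assuming a local SparkSession (the object and function names here are illustrative, not from the ticket):]

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.udf

object NativeUdfSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("native-udf-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Spark-native Scala UDF: no org.apache.hadoop.hive.ql.* classes involved.
    val plusOne = udf((x: Int) => x + 1)

    // Registering the same function makes it callable from SQL, too.
    spark.udf.register("plus_one", (x: Int) => x + 1)

    val df = Seq(1, 2, 3).toDF("n")
    df.select(plusOne($"n").as("n_plus_one")).show()
    spark.sql("SELECT plus_one(41) AS answer").show()

    spark.stop()
  }
}
```

[The remaining gap the ticket points at is persistent functions created through CREATE FUNCTION DDL, which this session-scoped registration does not cover.]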



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org