Posted to user@spark.apache.org by 许新浩 <94...@qq.com.INVALID> on 2023/04/14 08:25:33 UTC

How to create a Spark UDF using FunctionCatalog?

We are using Spark. Today I came across FunctionCatalog, and I have read the source of spark\sql\core\src\test\scala\org\apache\spark\sql\connector\DataSourceV2FunctionSuite.scala and have implemented ScalarFunction. But I still don't know how to register it in SQL.

Re: How to create a Spark UDF using FunctionCatalog?

Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,

I'm not sure I understand the question, but if you're asking how to
register (plug in) your own custom FunctionCatalog, it's done through the
spark.sql.catalog configuration property, e.g.

spark.sql.catalog.catalog-name=com.example.YourCatalogClass
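
For instance, set programmatically when building the session (a rough
sketch; "my_catalog" and com.example.YourCatalogClass are placeholder
names, not anything Spark ships):

import org.apache.spark.sql.SparkSession

// Registers your catalog implementation under the SQL catalog name "my_catalog".
// The same property can also be passed on spark-submit with --conf.
val spark = SparkSession.builder()
  .appName("function-catalog-demo")
  .config("spark.sql.catalog.my_catalog", "com.example.YourCatalogClass")
  .getOrCreate()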

spark.sql.catalog registers a CatalogPlugin that in your case is also
supposed to be a FunctionCatalog.
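
Concretely, since you already have a ScalarFunction, the missing piece is
a catalog whose loadFunction hands it back. A rough sketch, loosely
following what DataSourceV2FunctionSuite does (all class and function
names below are placeholders, and details may differ between Spark
versions):

// Sketch only: DemoFunctionCatalog, StrLen and "strlen" are example names.
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.connector.catalog.{CatalogPlugin, FunctionCatalog, Identifier}
import org.apache.spark.sql.connector.catalog.functions.{BoundFunction, ScalarFunction, UnboundFunction}
import org.apache.spark.sql.types.{DataType, IntegerType, StringType, StructType}
import org.apache.spark.sql.util.CaseInsensitiveStringMap

// The bound function: computes the character length of a string.
object StrLen extends ScalarFunction[Int] {
  override def inputTypes(): Array[DataType] = Array(StringType)
  override def resultType(): DataType = IntegerType
  override def name(): String = "strlen"
  override def produceResult(input: InternalRow): Int =
    input.getUTF8String(0).numChars()
}

// The unbound function the catalog hands out; Spark binds it to the input schema.
object StrLenUnbound extends UnboundFunction {
  override def name(): String = "strlen"
  override def description(): String = "strlen(string) - length of the input string"
  override def bind(inputType: StructType): BoundFunction = StrLen
}

// A CatalogPlugin that is also a FunctionCatalog, as described above.
class DemoFunctionCatalog extends CatalogPlugin with FunctionCatalog {
  private var catalogName: String = _

  override def initialize(name: String, options: CaseInsensitiveStringMap): Unit =
    catalogName = name

  override def name(): String = catalogName

  override def listFunctions(namespace: Array[String]): Array[Identifier] =
    Array(Identifier.of(namespace, "strlen"))

  override def loadFunction(ident: Identifier): UnboundFunction =
    if (ident.name() == "strlen") StrLenUnbound
    // a real catalog would throw Spark's NoSuchFunctionException here
    else throw new IllegalArgumentException(s"Unknown function: $ident")
}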

When needed, the implicit class CatalogHelper (its asFunctionCatalog
method) is used to expose your custom CatalogPlugin (e.g., catalog-name
above) as a FunctionCatalog, so functions identified by three-part
identifiers (catalog.schema.function) are resolved and executed using the
custom catalog implementation.
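
Once the catalog is registered and its loadFunction returns your
ScalarFunction, the function can be called from SQL by its three-part
name. A sketch, assuming the placeholder names above (depending on the
Spark version you may also need the namespace to exist, as the test suite
sets up with CREATE NAMESPACE):

-- resolves the function through my_catalog's loadFunction
SELECT my_catalog.ns.strlen('hello');   -- returns 5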

HTH

Regards,
Jacek Laskowski
----
"The Internals Of" Online Books <https://books.japila.pl/>
Follow me on https://twitter.com/jaceklaskowski


