Posted to user@spark.apache.org by Achilleus 003 <ac...@gmail.com> on 2019/03/28 04:45:25 UTC

UDFs in Spark

A couple of questions regarding UDFs:
1) Is there a way to get all the registered UDFs in Spark Scala?
I couldn't find any straightforward API for this, but I did find a pattern that returns all the registered UDFs:
spark.catalog.listFunctions.filter(_.className == null).collect()

This does the trick, but I'm not sure it holds true in all cases. Is there a better way to get all the registered UDFs?
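
For concreteness, here is a self-contained version of that pattern (a minimal sketch; the SparkSession setup and the sample UDF name plusOne are just for illustration):

    import org.apache.spark.sql.SparkSession

    object ListUdfs {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("list-udfs")
          .master("local[*]")
          .getOrCreate()

        // Register a sample UDF so the listing has something to show.
        spark.udf.register("plusOne", (x: Int) => x + 1)

        // User-registered UDFs come back from the catalog with a null
        // className, while built-in functions report the class that
        // implements them; hence the filter.
        spark.catalog.listFunctions
          .filter(_.className == null)
          .collect()
          .foreach(f => println(s"${f.name} (temporary: ${f.isTemporary})"))

        spark.stop()
      }
    }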

2) Is there a way I can share my UDFs across sessions when not using a Databricks notebook?
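
To illustrate what I mean by sharing: today each application has to re-register its UDFs itself. A sketch of that setup, with the registrations factored into a helper object in a common jar (the object name SharedUdfs and the UDFs inside it are made up):

    import org.apache.spark.sql.SparkSession

    // Shared registration helper, packaged in a common jar so every
    // application can call it when it builds its session.
    object SharedUdfs {
      def registerAll(spark: SparkSession): Unit = {
        spark.udf.register("plusOne", (x: Int) => x + 1)
        spark.udf.register("safeUpper",
          (s: String) => if (s == null) null else s.toUpperCase)
      }
    }

    // Each session then calls: SharedUdfs.registerAll(spark)

As I understand it, Hive-backed deployments can instead create permanent functions with CREATE FUNCTION ... USING JAR, which persist in the metastore and are visible across sessions; is that the recommended route?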
 
