Posted to issues@spark.apache.org by "Matyas Orhidi (JIRA)" <ji...@apache.org> on 2017/10/19 17:39:00 UTC

[jira] [Created] (SPARK-22314) Accessing Hive UDFs from Spark originally defined without 'USING JAR'

Matyas Orhidi created SPARK-22314:
-------------------------------------

             Summary: Accessing Hive UDFs from Spark originally defined without 'USING JAR'
                 Key: SPARK-22314
                 URL: https://issues.apache.org/jira/browse/SPARK-22314
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.2.0
            Reporter: Matyas Orhidi


When defining UDFs in Hive it is possible to load the UDF jar(s) from a shared location, e.g. from {{hive.reloadable.aux.jars.path}}, and then register the function with a plain CREATE FUNCTION statement:

{{CREATE FUNCTION <your_function_name> AS '<fully_qualified_class_name>';}}
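For illustration, a minimal sketch of the failing scenario (the function name, class, and table below are hypothetical):

{{-- In Hive, with the UDF jar available on hive.reloadable.aux.jars.path:}}
{{CREATE FUNCTION my_upper AS 'com.example.udf.MyUpper';}}
{{SELECT my_upper(name) FROM people;  -- works in Hive}}
{{-- The same call issued from Spark SQL fails, because the metastore entry}}
{{-- for my_upper carries no jar URI and Spark cannot load com.example.udf.MyUpper:}}
{{SELECT my_upper(name) FROM people;}}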

These UDFs do not work from Spark unless the Hive function was created with the USING JAR clause instead:

{{CREATE FUNCTION <your_function_name> AS '<fully_qualified_class_name>' USING JAR 'hdfs:///<path/to/jar/in/hdfs>';}}
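For comparison, a hedged sketch of the variant that does resolve from Spark (jar path and names are again hypothetical):

{{CREATE FUNCTION my_upper AS 'com.example.udf.MyUpper'}}
{{  USING JAR 'hdfs:///udfs/my-udfs.jar';}}
{{-- Spark reads the jar URI from the metastore, adds the jar, and the call succeeds:}}
{{SELECT my_upper(name) FROM people;}}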




