Posted to issues@spark.apache.org by "Saurabh Santhosh (JIRA)" <ji...@apache.org> on 2014/12/10 06:37:12 UTC

[jira] [Updated] (SPARK-4811) Custom UDTFs not working in Spark SQL

     [ https://issues.apache.org/jira/browse/SPARK-4811?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Saurabh Santhosh updated SPARK-4811:
------------------------------------
    Description: 
I am using the Thrift server interface to Spark SQL and using beeline to connect to it.
I tried Spark SQL versions 1.1.0 and 1.1.1, and both throw the following exception when using any custom UDTF.

These are the steps I followed:

*Created a UDTF 'com.x.y.xxx'.*
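
For context, here is a minimal sketch of the kind of Hive GenericUDTF class that would live at 'com.x.y.xxx'. The package and class names follow this report; the body, the output column name, and the example query in the trailing comments are hypothetical.

{code}
// Minimal, hypothetical UDTF body: emits one single-column string row per argument.
package com.x.y

import java.util.{ArrayList => JArrayList}

import org.apache.hadoop.hive.ql.exec.UDFArgumentException
import org.apache.hadoop.hive.ql.udf.generic.GenericUDTF
import org.apache.hadoop.hive.serde2.objectinspector.{ObjectInspector, ObjectInspectorFactory, StructObjectInspector}
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory

class xxx extends GenericUDTF {

  override def initialize(argOIs: Array[ObjectInspector]): StructObjectInspector = {
    if (argOIs.isEmpty) {
      throw new UDFArgumentException("xxx takes at least one argument")
    }
    // Declare a single output column named "value" of type string.
    val fieldNames = new JArrayList[String]()
    val fieldOIs = new JArrayList[ObjectInspector]()
    fieldNames.add("value")
    fieldOIs.add(PrimitiveObjectInspectorFactory.javaStringObjectInspector)
    ObjectInspectorFactory.getStandardStructObjectInspector(fieldNames, fieldOIs)
  }

  override def process(args: Array[AnyRef]): Unit = {
    // Emit one row per argument, converted to a string.
    args.foreach { arg =>
      val value: AnyRef = if (arg == null) null else arg.toString
      forward(Array[AnyRef](value))
    }
  }

  override def close(): Unit = {}
}

// From beeline the class is then registered and invoked roughly as
// (table and column names hypothetical):
//   CREATE TEMPORARY FUNCTION xxx AS 'com.x.y.xxx';
//   SELECT t.value FROM src LATERAL VIEW xxx(key) t AS value;
{code}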

Registered the UDTF using the following query:
*create temporary function xxx as 'com.x.y.xxx'*

The registration went through without any errors, but when I tried executing the UDTF I got the following error.

*java.lang.ClassNotFoundException: xxx*

The odd thing is that it is trying to load the function name instead of the function class. The exception is thrown at *line 81 in hiveUdfs.scala*.
I have been at it for quite a long time.
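
To make the reported failure mode concrete, here is a purely illustrative sketch; it is not the actual code in hiveUdfs.scala, only the shape of the mistake the stack trace suggests, and it assumes 'com.x.y.xxx' is on the classpath while no class named plain 'xxx' exists.

{code}
import scala.util.Try

val functionName = "xxx"               // the alias from CREATE TEMPORARY FUNCTION
val functionClassName = "com.x.y.xxx"  // the class the alias was registered with

// Resolving the alias as if it were a class name reproduces the reported error:
println(Try(Class.forName(functionName)))      // Failure(java.lang.ClassNotFoundException: xxx)
// Resolving the fully qualified class name is what was intended:
println(Try(Class.forName(functionClassName))) // Success(class com.x.y.xxx)
{code}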



> Custom UDTFs not working in Spark SQL
> -------------------------------------
>
>                 Key: SPARK-4811
>                 URL: https://issues.apache.org/jira/browse/SPARK-4811
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.1.0, 1.1.1
>            Reporter: Saurabh Santhosh
>             Fix For: 1.2.0
>
>
> I am using the Thrift server interface to Spark SQL and using beeline to connect to it.
> I tried Spark SQL versions 1.1.0 and 1.1.1, and both throw the following exception when using any custom UDTF.
> These are the steps I followed:
> *Created a UDTF 'com.x.y.xxx'.*
> Registered the UDTF using the following query:
> *create temporary function xxx as 'com.x.y.xxx'*
> The registration went through without any errors, but when I tried executing the UDTF I got the following error.
> *java.lang.ClassNotFoundException: xxx*
> The odd thing is that it is trying to load the function name instead of the function class. The exception is thrown at *line 81 in hiveUdfs.scala*.
> I have been at it for quite a long time.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org