Posted to issues@spark.apache.org by "Udbhav Agrawal (JIRA)" <ji...@apache.org> on 2019/01/02 11:24:00 UTC
[jira] [Issue Comment Deleted] (SPARK-26454) While creating new UDF
with JAR, though the UDF is created successfully, it throws an
IllegalArgumentException
[ https://issues.apache.org/jira/browse/SPARK-26454?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Udbhav Agrawal updated SPARK-26454:
-----------------------------------
Comment: was deleted
(was: [https://github.com/apache/spark/blob/02b510728c31b70e6035ad541bfcdc2b59dcd79a/core/src/main/scala/org/apache/spark/rpc/netty/NettyStreamManager.scala#L78] )
> While creating new UDF with JAR, though the UDF is created successfully, it throws an IllegalArgumentException
> ----------------------------------------------------------------------------------------------------------
>
> Key: SPARK-26454
> URL: https://issues.apache.org/jira/browse/SPARK-26454
> Project: Spark
> Issue Type: Bug
> Components: Spark Shell
> Affects Versions: 2.3.2
> Reporter: Udbhav Agrawal
> Priority: Trivial
> Attachments: create_exception.txt
>
>
> [Test steps]:
> 1. Launch spark-shell.
> 2. set role admin;
> 3. Create a new function:
> CREATE FUNCTION Func AS 'org.apache.hadoop.hive.ql.udf.generic.GenericUDFLastDayTest' USING JAR 'hdfs:///tmp/super_udf/two_udfs.jar'
> 4. Run a select using the function:
> sql("select Func('2018-03-09')").show()
> 5. Create a new UDF with the same JAR:
> sql("CREATE FUNCTION newFunc AS 'org.apache.hadoop.hive.ql.udf.generic.GenericUDFLastDayTest' USING JAR 'hdfs:///tmp/super_udf/two_udfs.jar'")
> 6. Run a select using the newly created function:
> sql("select newFunc('2018-03-09')").show()
> [Output]:
> The function is created successfully, but an IllegalArgumentException is thrown. The select returns the expected result, but it is also accompanied by an IllegalArgumentException.
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org