Posted to issues@spark.apache.org by "ABHISHEK KUMAR GUPTA (JIRA)" <ji...@apache.org> on 2019/08/09 05:59:00 UTC

[jira] [Created] (SPARK-28672) [UDF] Duplicate function creation should not be allowed

ABHISHEK KUMAR GUPTA created SPARK-28672:
--------------------------------------------

             Summary: [UDF] Duplicate function creation should not be allowed
                 Key: SPARK-28672
                 URL: https://issues.apache.org/jira/browse/SPARK-28672
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.4.0
            Reporter: ABHISHEK KUMAR GUPTA


0: jdbc:hive2://10.18.18.214:23040/default> create function addm_3  AS 'com.huawei.bigdata.hive.example.udf.multiply' using jar 'hdfs://hacluster/user/Multiply.jar';
+---------+--+
| Result  |
+---------+--+
+---------+--+
No rows selected (0.084 seconds)
0: jdbc:hive2://10.18.18.214:23040/default> create temporary function addm_3  AS 'com.huawei.bigdata.hive.example.udf.multiply' using jar 'hdfs://hacluster/user/Multiply.jar';
INFO  : converting to local hdfs://hacluster/user/Multiply.jar
INFO  : Added [/tmp/8a396308-41f8-4335-9de4-8268ce5c70fe_resources/Multiply.jar] to class path
INFO  : Added resources: [hdfs://hacluster/user/Multiply.jar]
+---------+--+
| Result  |
+---------+--+
+---------+--+
No rows selected (0.134 seconds)
0: jdbc:hive2://10.18.18.214:23040/default> show functions like addm_3;
+-----------------+--+
|    function     |
+-----------------+--+
| addm_3          |
| default.addm_3  |
+-----------------+--+
2 rows selected (0.047 seconds)
0: jdbc:hive2://10.18.18.214:23040/default>
When SHOW FUNCTIONS is executed, it lists both functions. But which database does the permanent function resolve to when the user has not specified one?
Creating a temporary function with the same name as an existing permanent function should not be allowed; the duplicate should be rejected.
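To illustrate the ambiguity being reported, here is a minimal, hypothetical sketch (not Spark's actual implementation) of a two-level function catalog in which a session-local temporary registry is consulted before the persistent, database-qualified catalog. It shows how both registrations of addm_3 can coexist, and how the unqualified name silently resolves to the temporary one:

```python
# Hypothetical model (NOT Spark source) of the resolution order the
# report describes: temporary functions are checked before the
# persistent catalog, so a same-named temporary UDF shadows the
# permanent one in the default database.

class FunctionCatalog:
    def __init__(self, default_db="default"):
        self.default_db = default_db
        self.temporary = {}            # name -> implementing class
        self.persistent = {}           # (db, name) -> implementing class

    def create_function(self, name, class_name, temporary=False):
        if temporary:
            # The report argues this should fail when a permanent
            # function with the same name already exists.
            self.temporary[name] = class_name
        else:
            self.persistent[(self.default_db, name)] = class_name

    def lookup(self, name):
        # Unqualified names hit the temporary registry first,
        # silently shadowing the permanent function.
        if name in self.temporary:
            return ("temporary", self.temporary[name])
        key = (self.default_db, name)
        if key in self.persistent:
            return (self.default_db, self.persistent[key])
        raise KeyError(name)


catalog = FunctionCatalog()
catalog.create_function("addm_3",
                        "com.huawei.bigdata.hive.example.udf.multiply")
catalog.create_function("addm_3",
                        "com.huawei.bigdata.hive.example.udf.multiply",
                        temporary=True)
# Both registrations coexist, mirroring the SHOW FUNCTIONS output above.
print(catalog.lookup("addm_3"))
```

Under this (assumed) resolution order, the temporary function wins every unqualified lookup, which is why allowing the duplicate creation is confusing for the user.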



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
