Posted to issues@hive.apache.org by "Ashutosh Chauhan (JIRA)" <ji...@apache.org> on 2016/07/10 17:29:10 UTC

[jira] [Updated] (HIVE-14113) Create function failed but function in show function list

     [ https://issues.apache.org/jira/browse/HIVE-14113?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ashutosh Chauhan updated HIVE-14113:
------------------------------------
       Resolution: Fixed
         Assignee: niklaus xiao  (was: Navis)
    Fix Version/s:     (was: 1.3.0)
                   2.2.0
           Status: Resolved  (was: Patch Available)

Committed to master. Thanks, Niklaus!

> Create function failed but function in show function list
> ---------------------------------------------------------
>
>                 Key: HIVE-14113
>                 URL: https://issues.apache.org/jira/browse/HIVE-14113
>             Project: Hive
>          Issue Type: Bug
>          Components: UDF
>    Affects Versions: 1.2.0
>            Reporter: niklaus xiao
>            Assignee: niklaus xiao
>             Fix For: 2.2.0
>
>         Attachments: HIVE-14113.1.patch
>
>
> 1. Create a function with an invalid HDFS path; /udf/udf-test.jar does not exist (a pre-flight existence check is sketched after the error output below):
> {quote}
> create function my_lower as 'com.tang.UDFLower' using jar 'hdfs:///udf/udf-test.jar';
> {quote}
> It fails with the following exception:
> {quote}
> 0: jdbc:hive2://189.39.151.44:10000/> create function my_lower as 'com.tang.UDFLower' using jar 'hdfs:///udf/udf-test.jar';
> INFO  : converting to local hdfs:///udf/udf-test.jar
> ERROR : Failed to read external resource hdfs:///udf/udf-test.jar
> java.lang.RuntimeException: Failed to read external resource hdfs:///udf/udf-test.jar
> 	at org.apache.hadoop.hive.ql.session.SessionState.downloadResource(SessionState.java:1384)
> 	at org.apache.hadoop.hive.ql.session.SessionState.resolveAndDownload(SessionState.java:1340)
> 	at org.apache.hadoop.hive.ql.session.SessionState.add_resources(SessionState.java:1264)
> 	at org.apache.hadoop.hive.ql.session.SessionState.add_resources(SessionState.java:1250)
> 	at org.apache.hadoop.hive.ql.exec.FunctionTask.addFunctionResources(FunctionTask.java:306)
> 	at org.apache.hadoop.hive.ql.exec.Registry.registerToSessionRegistry(Registry.java:466)
> 	at org.apache.hadoop.hive.ql.exec.Registry.registerPermanentFunction(Registry.java:206)
> 	at org.apache.hadoop.hive.ql.exec.FunctionRegistry.registerPermanentFunction(FunctionRegistry.java:1551)
> 	at org.apache.hadoop.hive.ql.exec.FunctionTask.createPermanentFunction(FunctionTask.java:136)
> 	at org.apache.hadoop.hive.ql.exec.FunctionTask.execute(FunctionTask.java:75)
> 	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:158)
> 	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:101)
> 	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1965)
> 	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1723)
> 	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1475)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1283)
> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1278)
> 	at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:167)
> 	at org.apache.hive.service.cli.operation.SQLOperation.access$200(SQLOperation.java:75)
> 	at org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:245)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1711)
> 	at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:258)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> Caused by: java.io.FileNotFoundException: File does not exist: hdfs:/udf/udf-test.jar
> 	at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1391)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1383)
> 	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1383)
> 	at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:340)
> 	at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:292)
> 	at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2034)
> 	at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2003)
> 	at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1979)
> 	at org.apache.hadoop.hive.ql.session.SessionState.downloadResource(SessionState.java:1370)
> 	... 28 more
> ERROR : Failed to register default.my_lower using class com.tang.UDFLower
> Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.FunctionTask (state=08S01,code=1)
> {quote}
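>
> The failure itself is expected here, since the jar really is missing. To rule that out before running CREATE FUNCTION, a pre-flight existence check such as the one below can help. This is only an illustrative standalone snippet (it assumes the Hadoop client is on the classpath and fs.defaultFS points at the cluster), not part of Hive:
> {quote}
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
>
> public class CheckUdfJar {
>   public static void main(String[] args) throws Exception {
>     // The jar path from the failing CREATE FUNCTION statement above.
>     Path jar = new Path("hdfs:///udf/udf-test.jar");
>     // newInstance avoids closing a cached FileSystem handle shared with other code.
>     try (FileSystem fs = FileSystem.newInstance(jar.toUri(), new Configuration())) {
>       System.out.println(jar + (fs.exists(jar) ? " exists" : " does not exist"));
>     }
>   }
> }
> {quote}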
> 2. Execute show functions: the function my_lower that failed to be created still appears in the list, yet calling it fails (a sketch of the missing registry rollback follows the output below):
> {quote}
> 0: jdbc:hive2://189.39.151.44:21066/> show functions;
> +-------------------------+--+
> |        tab_name         |
> +-------------------------+--+
> | day                     |
> | dayofmonth              |
> | decode                  |
> | default.my_lower        |
> | degrees                 |
> | dense_rank              |
> 0: jdbc:hive2://189.39.151.44:10000/> select my_lower(name) from stu;
> Error: Error while compiling statement: FAILED: SemanticException [Error 10011]: Invalid function my_lower (state=42000,code=10011)
> {quote}
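>
> This second step is the actual bug: the failed CREATE FUNCTION leaves a stale entry behind, so the function is listed by show functions but cannot be resolved when called. Below is a minimal, self-contained sketch of the rollback that is missing; the class and method names are invented for illustration and this is not the HIVE-14113 patch, only the shape of the guard it needs (register, validate the resources, and unregister again if validation fails):
> {quote}
> import java.io.IOException;
> import java.net.URI;
> import java.nio.file.Files;
> import java.nio.file.Paths;
> import java.util.HashMap;
> import java.util.Map;
>
> public class StaleEntryDemo {
>   // Stand-in for the permanent function registry: function name -> implementing class.
>   private final Map<String, String> functions = new HashMap<>();
>
>   // Stand-in for resource download; it only checks a local path derived from the URI.
>   private void downloadResource(String jarUri) throws IOException {
>     if (!Files.exists(Paths.get(URI.create(jarUri).getPath()))) {
>       throw new IOException("File does not exist: " + jarUri);
>     }
>   }
>
>   public boolean registerPermanentFunction(String name, String className, String jarUri) {
>     functions.put(name, className);   // optimistic registration
>     try {
>       downloadResource(jarUri);       // validate the jar
>       return true;
>     } catch (IOException e) {
>       functions.remove(name);         // roll back so no stale entry survives
>       System.err.println("Failed to register " + name + ": " + e.getMessage());
>       return false;
>     }
>   }
>
>   public static void main(String[] args) {
>     StaleEntryDemo registry = new StaleEntryDemo();
>     registry.registerPermanentFunction("default.my_lower", "com.tang.UDFLower",
>         "hdfs:///udf/udf-test.jar");
>     // Prints false; without the remove() above it would print true, mirroring the bug.
>     System.out.println(registry.functions.containsKey("default.my_lower"));
>   }
> }
> {quote}
> In Hive itself the corresponding code paths are Registry.registerPermanentFunction and registerToSessionRegistry from the stack trace above; the sketch only mirrors their ordering, not their signatures.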



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)