Posted to issues@hive.apache.org by "BELUGA BEHR (JIRA)" <ji...@apache.org> on 2018/09/21 21:19:00 UTC

[jira] [Comment Edited] (HIVE-14609) HS2 cannot drop a function whose associated jar file has been removed

    [ https://issues.apache.org/jira/browse/HIVE-14609?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16624170#comment-16624170 ] 

BELUGA BEHR edited comment on HIVE-14609 at 9/21/18 9:18 PM:
-------------------------------------------------------------

By the same token, I cannot {{describe function}} either.
{code:java}
0: jdbc:hive2://host> describe function row_sequence;
INFO  : Compiling command(queryId=hive_20180921135555_3c26b2ae-9f0a-4a80-ba3c-a96b23fe8f9d): describe function row_sequence
INFO  : Semantic Analysis Completed
INFO  : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string, comment:from deserializer)], properties:null)
INFO  : Completed compiling command(queryId=hive_20180921135555_3c26b2ae-9f0a-4a80-ba3c-a96b23fe8f9d); Time taken: 0.286 seconds
INFO  : Executing command(queryId=hive_20180921135555_3c26b2ae-9f0a-4a80-ba3c-a96b23fe8f9d): describe function row_sequence
INFO  : Starting task [Stage-0:DDL] in serial mode
INFO  : converting to local hdfs://ns1/tmp/hive-contrib-1.1.0.jar
ERROR : Failed to read external resource hdfs://ns1/tmp/hive-contrib-1.1.0.jar
java.lang.RuntimeException: Failed to read external resource hdfs://ns1/tmp/hive-contrib-1.1.0.jar
        at org.apache.hadoop.hive.ql.session.SessionState.downloadResource(SessionState.java:1442)
        at org.apache.hadoop.hive.ql.session.SessionState.resolveAndDownload(SessionState.java:1398)
        at org.apache.hadoop.hive.ql.session.SessionState.add_resources(SessionState.java:1322)
        at org.apache.hadoop.hive.ql.session.SessionState.add_resources(SessionState.java:1308)
        at org.apache.hadoop.hive.ql.exec.FunctionTask.addFunctionResources(FunctionTask.java:304)
        at org.apache.hadoop.hive.ql.exec.Registry.registerToSessionRegistry(Registry.java:570)
        at org.apache.hadoop.hive.ql.exec.Registry.getQualifiedFunctionInfo(Registry.java:556)
        at org.apache.hadoop.hive.ql.exec.Registry.getFunctionInfo(Registry.java:308)
        at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getFunctionInfo(FunctionRegistry.java:471)
        at org.apache.hadoop.hive.ql.exec.DDLTask.describeFunction(DDLTask.java:2907)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:385)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:214)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:99)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2054)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1750)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1503)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1287)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1282)
        at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:236)
        at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:89)
        at org.apache.hive.service.cli.operation.SQLOperation$3$1.run(SQLOperation.java:301)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
        at org.apache.hive.service.cli.operation.SQLOperation$3.run(SQLOperation.java:314)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.FileNotFoundException: File does not exist: hdfs://ns1/tmp/hive-contrib-1.1.0.jar
        at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1270)
        at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1262)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1262)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:340)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:292)
        at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2123)
        at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2092)
        at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2068)
        at org.apache.hadoop.hive.ql.session.SessionState.downloadResource(SessionState.java:1428)
        ... 29 more

INFO  : Completed executing command(queryId=hive_20180921135555_3c26b2ae-9f0a-4a80-ba3c-a96b23fe8f9d); Time taken: 0.383 seconds
INFO  : OK
+------------------------------------------+--+
|                 tab_name                 |
+------------------------------------------+--+
| Function 'row_sequence' does not exist.  |
+------------------------------------------+--+
{code}


was (Author: belugabehr):
By the same token, I cannot {{describe function}} either to figure out where the missing JAR file is.


> HS2 cannot drop a function whose associated jar file has been removed
> ---------------------------------------------------------------------
>
>                 Key: HIVE-14609
>                 URL: https://issues.apache.org/jira/browse/HIVE-14609
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Yibing Shi
>            Assignee: Chaoyu Tang
>            Priority: Major
>
> Create a permanent function with the command below:
> {code:sql}
> create function yshi.dummy as 'com.yshi.hive.udf.DummyUDF' using jar 'hdfs://host-10-17-81-142.coe.cloudera.com:8020/hive/jars/yshi.jar';
> {code}
> After that, delete the HDFS file {{hdfs://host-10-17-81-142.coe.cloudera.com:8020/hive/jars/yshi.jar}}, and *restart HS2 to remove the loaded class*.
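> A minimal sketch of that deletion step, using the Hadoop {{FileSystem}} API (equivalent to {{hdfs dfs -rm}} on the command line); the class name {{DeleteUdfJar}} is purely illustrative, and the HS2 restart depends on the deployment, so it is only noted in a comment:
> {code:java}
> import java.net.URI;
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
>
> // Illustrative helper, not part of Hive: removes the jar that backs the UDF.
> public class DeleteUdfJar {
>   public static void main(String[] args) throws Exception {
>     String jar = "hdfs://host-10-17-81-142.coe.cloudera.com:8020/hive/jars/yshi.jar";
>     FileSystem fs = FileSystem.get(URI.create(jar), new Configuration());
>     boolean deleted = fs.delete(new Path(jar), false); // non-recursive delete of the jar
>     System.out.println("Deleted " + jar + ": " + deleted);
>     // HS2 must then be restarted so the already-loaded UDF class is unloaded.
>   }
> }
> {code}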
> Now the function cannot be dropped:
> {noformat}
> 0: jdbc:hive2://10.17.81.144:10000/default> show functions yshi.dummy;
> INFO  : Compiling command(queryId=hive_20160821213434_d0271d77-84d8-45ba-8d92-3da1c143bded): show functions yshi.dummy
> INFO  : Semantic Analysis Completed
> INFO  : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string, comment:from deserializer)], properties:null)
> INFO  : Completed compiling command(queryId=hive_20160821213434_d0271d77-84d8-45ba-8d92-3da1c143bded); Time taken: 1.259 seconds
> INFO  : Executing command(queryId=hive_20160821213434_d0271d77-84d8-45ba-8d92-3da1c143bded): show functions yshi.dummy
> INFO  : Starting task [Stage-0:DDL] in serial mode
> INFO  : SHOW FUNCTIONS is deprecated, please use SHOW FUNCTIONS LIKE instead.
> INFO  : Completed executing command(queryId=hive_20160821213434_d0271d77-84d8-45ba-8d92-3da1c143bded); Time taken: 0.024 seconds
> INFO  : OK
> +-------------+--+
> |  tab_name   |
> +-------------+--+
> | yshi.dummy  |
> +-------------+--+
> 1 row selected (3.877 seconds)
> 0: jdbc:hive2://10.17.81.144:10000/default> drop function yshi.dummy;
> INFO  : Compiling command(queryId=hive_20160821213434_47d14df5-59b3-4ebc-9a48-5e1d9c60c1fc): drop function yshi.dummy
> INFO  : converting to local hdfs://host-10-17-81-142.coe.cloudera.com:8020/hive/jars/yshi.jar
> ERROR : Failed to read external resource hdfs://host-10-17-81-142.coe.cloudera.com:8020/hive/jars/yshi.jar
> java.lang.RuntimeException: Failed to read external resource hdfs://host-10-17-81-142.coe.cloudera.com:8020/hive/jars/yshi.jar
>        	at org.apache.hadoop.hive.ql.session.SessionState.downloadResource(SessionState.java:1200)
>        	at org.apache.hadoop.hive.ql.session.SessionState.add_resources(SessionState.java:1136)
>        	at org.apache.hadoop.hive.ql.session.SessionState.add_resources(SessionState.java:1126)
>        	at org.apache.hadoop.hive.ql.exec.FunctionTask.addFunctionResources(FunctionTask.java:304)
>        	at org.apache.hadoop.hive.ql.exec.Registry.registerToSessionRegistry(Registry.java:470)
>        	at org.apache.hadoop.hive.ql.exec.Registry.getQualifiedFunctionInfo(Registry.java:456)
>        	at org.apache.hadoop.hive.ql.exec.Registry.getFunctionInfo(Registry.java:245)
>        	at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getFunctionInfo(FunctionRegistry.java:455)
>        	at org.apache.hadoop.hive.ql.parse.FunctionSemanticAnalyzer.analyzeDropFunction(FunctionSemanticAnalyzer.java:99)
>        	at org.apache.hadoop.hive.ql.parse.FunctionSemanticAnalyzer.analyzeInternal(FunctionSemanticAnalyzer.java:61)
>        	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:222)
>        	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:451)
>        	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:311)
>        	at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1194)
>        	at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1181)
>        	at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:134)
>        	at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:206)
>        	at org.apache.hive.service.cli.operation.Operation.run(Operation.java:316)
>        	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:425)
>        	at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:401)
>        	at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:258)
>        	at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:506)
>        	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
>        	at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
>        	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>        	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>        	at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:718)
>        	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
>        	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>        	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>        	at java.lang.Thread.run(Thread.java:745)
> Caused by: java.io.FileNotFoundException: File does not exist: hdfs://host-10-17-81-142.coe.cloudera.com:8020/hive/jars/yshi.jar
>        	at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1219)
>        	at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1211)
>        	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>        	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1211)
>        	at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:340)
>        	at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:292)
>        	at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2014)
>        	at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1983)
>        	at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1959)
>        	at org.apache.hadoop.hive.ql.session.SessionState.downloadResource(SessionState.java:1186)
>        	... 30 more
> INFO  : Semantic Analysis Completed
> INFO  : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
> INFO  : Completed compiling command(queryId=hive_20160821213434_47d14df5-59b3-4ebc-9a48-5e1d9c60c1fc); Time taken: 0.297 seconds
> INFO  : Executing command(queryId=hive_20160821213434_47d14df5-59b3-4ebc-9a48-5e1d9c60c1fc): drop function yshi.dummy
> INFO  : Completed executing command(queryId=hive_20160821213434_47d14df5-59b3-4ebc-9a48-5e1d9c60c1fc); Time taken: 0.003 seconds
> INFO  : OK
> No rows affected (0.324 seconds)
> 0: jdbc:hive2://10.17.81.144:10000/default> show functions yshi.dummy;
> INFO  : Compiling command(queryId=hive_20160821213434_b69fd2a2-ec0b-463c-821c-3273d834c0ab): show functions yshi.dummy
> INFO  : Semantic Analysis Completed
> INFO  : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string, comment:from deserializer)], properties:null)
> INFO  : Completed compiling command(queryId=hive_20160821213434_b69fd2a2-ec0b-463c-821c-3273d834c0ab); Time taken: 0.123 seconds
> INFO  : Executing command(queryId=hive_20160821213434_b69fd2a2-ec0b-463c-821c-3273d834c0ab): show functions yshi.dummy
> INFO  : Starting task [Stage-0:DDL] in serial mode
> INFO  : SHOW FUNCTIONS is deprecated, please use SHOW FUNCTIONS LIKE instead.
> INFO  : Completed executing command(queryId=hive_20160821213434_b69fd2a2-ec0b-463c-821c-3273d834c0ab); Time taken: 0.004 seconds
> INFO  : OK
> +-------------+--+
> |  tab_name   |
> +-------------+--+
> | yshi.dummy  |
> +-------------+--+
> 1 row selected (0.15 seconds)
> 0: jdbc:hive2://10.17.81.144:10000/default>
> {noformat}
> There are 2 problems to fix here:
> # Return an error to the client if the drop function operation fails.
> # Drop the function definition even if the associated jar file cannot be downloaded to the local file system, as sketched below.
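> Below is a rough, self-contained sketch of the behavior those two fixes describe (all class and method names here are illustrative stand-ins, not Hive's actual code): the jar download is treated as best effort, the function definition is dropped regardless, and the real outcome is returned to the client.
> {code:java}
> import java.io.FileNotFoundException;
>
> // Illustrative stand-in for the drop-function flow; not the real Hive classes.
> public class DropFunctionSketch {
>
>   // Simulates the resource download failing because the jar has been removed.
>   static void downloadResource(String uri) throws FileNotFoundException {
>     throw new FileNotFoundException("File does not exist: " + uri);
>   }
>
>   // Fix #2: a missing jar must not block dropping the function definition.
>   static boolean dropFunction(String name, String jarUri) {
>     try {
>       downloadResource(jarUri);
>     } catch (FileNotFoundException e) {
>       System.err.println("WARN: jar missing for " + name + ", dropping metadata anyway");
>     }
>     // ... remove the function entry from the metastore here ...
>     return true; // Fix #1: report the actual outcome instead of a silent no-op "OK"
>   }
>
>   public static void main(String[] args) {
>     boolean ok = dropFunction("yshi.dummy",
>         "hdfs://host-10-17-81-142.coe.cloudera.com:8020/hive/jars/yshi.jar");
>     System.out.println(ok ? "DROP FUNCTION succeeded" : "DROP FUNCTION failed");
>   }
> }
> {code}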



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)