Posted to dev@hive.apache.org by "Craig Condit (JIRA)" <ji...@apache.org> on 2014/05/03 01:47:16 UTC

[jira] [Updated] (HIVE-6915) Hive Hbase queries fail on secure Tez cluster

     [ https://issues.apache.org/jira/browse/HIVE-6915?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Craig Condit updated HIVE-6915:
-------------------------------

    Attachment: HIVE-6915.2.patch

New patch version. This version calls TableMapReduceUtil.initCredentials() only when the current user is logged in via Kerberos.
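For illustration, a minimal sketch of the guard described above, using the Hadoop UserGroupInformation API. The helper class and method names are hypothetical, and the actual patch may structure the check differently:

{noformat}
import java.io.IOException;

import org.apache.hadoop.hbase.mapred.TableMapReduceUtil;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.security.UserGroupInformation;

// Hypothetical helper: only ask HBase for delegation tokens when the current
// user actually holds Kerberos credentials. Inside the Tez AM there is no TGT
// (hence the "Failed to find any Kerberos tgt" error above), so the call is
// skipped there and the tokens already shipped with the job are used instead.
public class HBaseCredentialHelper {

  public static void initCredentialsIfKerberos(JobConf jobConf) throws IOException {
    UserGroupInformation ugi = UserGroupInformation.getCurrentUser();
    if (ugi.hasKerberosCredentials()) {
      // A TGT is available, so contacting HBase's token provider is safe.
      TableMapReduceUtil.initCredentials(jobConf);
    }
    // Otherwise rely on credentials already present in the job's Credentials object.
  }
}
{noformat}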

> Hive Hbase queries fail on secure Tez cluster
> ---------------------------------------------
>
>                 Key: HIVE-6915
>                 URL: https://issues.apache.org/jira/browse/HIVE-6915
>             Project: Hive
>          Issue Type: Bug
>          Components: Tez
>    Affects Versions: 0.13.0
>         Environment: Kerberos secure Tez cluster
>            Reporter: Deepesh Khandelwal
>            Assignee: Siddharth Seth
>         Attachments: HIVE-6915.1.patch, HIVE-6915.2.patch
>
>
> Hive queries reading and writing to HBase are currently failing with the following exception in a secure Tez cluster:
> {noformat}
> 2014-04-14 13:47:05,644 FATAL [InputInitializer [Map 1] #0] org.apache.hadoop.ipc.RpcClient: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
> javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
> 	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
> 	at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:152)
> 	at org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupSaslConnection(RpcClient.java:792)
> 	at org.apache.hadoop.hbase.ipc.RpcClient$Connection.access$800(RpcClient.java:349)
> 	at org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:918)
> 	at org.apache.hadoop.hbase.ipc.RpcClient$Connection$2.run(RpcClient.java:915)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:415)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
> 	at org.apache.hadoop.hbase.ipc.RpcClient$Connection.setupIOstreams(RpcClient.java:915)
> 	at org.apache.hadoop.hbase.ipc.RpcClient$Connection.writeRequest(RpcClient.java:1065)
> 	at org.apache.hadoop.hbase.ipc.RpcClient$Connection.tracedWriteRequest(RpcClient.java:1032)
> 	at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1474)
> 	at org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1684)
> 	at org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1737)
> 	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.execService(ClientProtos.java:29288)
> 	at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1562)
> 	at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:87)
> 	at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:84)
> 	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:121)
> 	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:97)
> 	at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:90)
> 	at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callBlockingMethod(CoprocessorRpcChannel.java:67)
> 	at org.apache.hadoop.hbase.protobuf.generated.AuthenticationProtos$AuthenticationService$BlockingStub.getAuthenticationToken(AuthenticationProtos.java:4512)
> 	at org.apache.hadoop.hbase.security.token.TokenUtil.obtainToken(TokenUtil.java:60)
> 	at org.apache.hadoop.hbase.security.token.TokenUtil$3.run(TokenUtil.java:174)
> 	at org.apache.hadoop.hbase.security.token.TokenUtil$3.run(TokenUtil.java:172)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:415)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
> 	at org.apache.hadoop.hbase.security.token.TokenUtil.obtainTokenForJob(TokenUtil.java:171)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:601)
> 	at org.apache.hadoop.hbase.util.Methods.call(Methods.java:39)
> 	at org.apache.hadoop.hbase.security.User$SecureHadoopUser.obtainAuthTokenForJob(User.java:334)
> 	at org.apache.hadoop.hbase.mapred.TableMapReduceUtil.initCredentials(TableMapReduceUtil.java:201)
> 	at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplits(HiveHBaseTableInputFormat.java:415)
> 	at org.apache.hadoop.hive.ql.io.HiveInputFormat.addSplitsForGroup(HiveInputFormat.java:291)
> 	at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:372)
> 	at org.apache.hadoop.mapred.split.TezGroupedSplitsInputFormat.getSplits(TezGroupedSplitsInputFormat.java:68)
> 	at org.apache.tez.mapreduce.hadoop.MRHelpers.generateOldSplits(MRHelpers.java:263)
> 	at org.apache.tez.mapreduce.common.MRInputAMSplitGenerator.initialize(MRInputAMSplitGenerator.java:139)
> 	at org.apache.tez.dag.app.dag.RootInputInitializerRunner$InputInitializerCallable$1.run(RootInputInitializerRunner.java:154)
> 	at org.apache.tez.dag.app.dag.RootInputInitializerRunner$InputInitializerCallable$1.run(RootInputInitializerRunner.java:146)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:415)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
> 	at org.apache.tez.dag.app.dag.RootInputInitializerRunner$InputInitializerCallable.call(RootInputInitializerRunner.java:146)
> 	at org.apache.tez.dag.app.dag.RootInputInitializerRunner$InputInitializerCallable.call(RootInputInitializerRunner.java:114)
> 	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:722)
> Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
> 	at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
> 	at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
> 	at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
> 	at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
> 	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
> 	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
> 	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
> 	... 55 more
> {noformat}
> A kinit was performed on the client. The error appears in the Tez application logs. The queries work fine when I change hive.execution.engine to MR.



--
This message was sent by Atlassian JIRA
(v6.2#6252)