Posted to dev@hive.apache.org by "Olaf Flebbe (JIRA)" <ji...@apache.org> on 2015/04/07 20:40:13 UTC

[jira] [Created] (HIVE-10240) Patch HIVE-9473 breaks KERBEROS

Olaf Flebbe created HIVE-10240:
----------------------------------

             Summary: Patch HIVE-9473 breaks KERBEROS
                 Key: HIVE-10240
                 URL: https://issues.apache.org/jira/browse/HIVE-10240
             Project: Hive
          Issue Type: Bug
          Components: Authentication, HiveServer2
    Affects Versions: 1.0.0
            Reporter: Olaf Flebbe
             Fix For: 1.0.1


The patch from HIVE-9473 introduces a regression: HiveServer2 no longer starts properly with our configuration (more or less the Bigtop environment):

SQL standard authorization enabled, doAs disabled, Tez enabled, Kerberos enabled; see the hive-site.xml sketch below.
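
Roughly, in hive-site.xml terms (a sketch only, not the exact config; the property names are the standard Hive settings, and the principal and keytab values are taken from the startup log further below):
{code}
<!-- Sketch of the configuration described above; values are illustrative. -->
<property>
  <name>hive.security.authorization.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.security.authorization.manager</name>
  <value>org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizerFactory</value>
</property>
<property>
  <name>hive.server2.enable.doAs</name>
  <value>false</value>
</property>
<property>
  <name>hive.execution.engine</name>
  <value>tez</value>
</property>
<property>
  <name>hive.server2.authentication</name>
  <value>KERBEROS</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.principal</name>
  <value>hive/_HOST@PROTO.BSI.DE</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.keytab</name>
  <value>/etc/hive.keytab</value>
</property>
{code}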

The problem seems to be that the Kerberos ticket is not yet present when HiveServer2 first tries to access HDFS. With HIVE-9473 reverted, obtaining the ticket is one of the first things HiveServer2 does (see the sketch below).
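
For context, the working startup (second log below) performs an explicit keytab login before any HDFS access ("Login successful for user hive/node2.proto.bsi.de@PROTO.BSI.DE using keytab file /etc/hive.keytab"). A minimal sketch of that kind of early login, using the standard Hadoop UserGroupInformation API; this is not the actual HiveServer2 code path, just an illustration of the ordering that the broken startup is missing:
{code}
// Sketch only: illustrates an early keytab login like the one the working log
// reports. Not the actual HiveServer2 code path.
import java.io.IOException;
import java.net.InetAddress;

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.security.SecurityUtil;
import org.apache.hadoop.security.UserGroupInformation;

public class EarlyKerberosLoginSketch {
  public static void loginFromKeytab(HiveConf conf) throws IOException {
    // Make UGI pick up hadoop.security.authentication=kerberos from the config.
    UserGroupInformation.setConfiguration(conf);

    // hive.server2.authentication.kerberos.principal, e.g. hive/_HOST@PROTO.BSI.DE
    String principalConf = conf.getVar(HiveConf.ConfVars.HIVE_SERVER2_KERBEROS_PRINCIPAL);
    // hive.server2.authentication.kerberos.keytab, e.g. /etc/hive.keytab
    String keytab = conf.getVar(HiveConf.ConfVars.HIVE_SERVER2_KERBEROS_KEYTAB);

    // Replace _HOST with the local canonical hostname, as Hadoop daemons do.
    String principal = SecurityUtil.getServerPrincipal(
        principalConf, InetAddress.getLocalHost().getCanonicalHostName());

    // After this call the process holds a Kerberos TGT, so the later HDFS call in
    // SessionState.createRootHDFSDir() can authenticate. Without it, getFileInfo()
    // fails with "GSS initiate failed ... Failed to find any Kerberos tgt", as in
    // the first log below.
    UserGroupInformation.loginUserFromKeytab(principal, keytab);
  }
}
{code}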


Below are the startup log of vanilla hive-1.0.0 and the startup log of hive-1.0.0 with the following commit reverted, where HiveServer2 starts correctly.
{code}
commit 35582c2065a6b90b003a656bdb3b0ff08b0c35b9
Author: Thejas Nair <th...@apache.org>
Date:   Fri Jan 30 00:05:50 2015 +0000

    HIVE-9473 : sql std auth should disallow built-in udfs that allow any java methods to be called (Thejas Nair, reviewed by Jason Dere)
    
    git-svn-id: https://svn.apache.org/repos/asf/hive/branches/branch-1.0@1655891 13f79535-47bb-0310-9956-ffa450edef68
{code}
This is the commit that was reverted.


Startup of vanilla hive-1.0.0 HiveServer2:
{code}
STARTUP_MSG:   build = git://os2-debian80/net/os2-debian80/fs1/olaf/bigtop/output/hive/hive-1.0.0 -r 813996292c9f966109f990127ddd5673cf813125; compiled by 'olaf' on Tue Apr 7 09:33:01 CEST 2015
************************************************************/
2015-04-07 10:23:52,579 INFO  [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(292)) - Starting HiveServer2
2015-04-07 10:23:53,104 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:newRawStore(556)) - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2015-04-07 10:23:53,135 INFO  [main]: metastore.ObjectStore (ObjectStore.java:initialize(264)) - ObjectStore, initialize called
2015-04-07 10:23:54,775 INFO  [main]: metastore.ObjectStore (ObjectStore.java:getPMF(345)) - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2015-04-07 10:23:56,953 INFO  [main]: metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(132)) - Using direct SQL, underlying DB is DERBY
2015-04-07 10:23:56,954 INFO  [main]: metastore.ObjectStore (ObjectStore.java:setConf(247)) - Initialized ObjectStore
2015-04-07 10:23:57,275 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:createDefaultRoles_core(630)) - Added admin role in metastore
2015-04-07 10:23:57,276 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:createDefaultRoles_core(639)) - Added public role in metastore
2015-04-07 10:23:58,241 WARN  [main]: ipc.Client (Client.java:run(675)) - Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
2015-04-07 10:23:58,248 WARN  [main]: ipc.Client (Client.java:run(675)) - Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
2015-04-07 10:23:58,249 INFO  [main]: retry.RetryInvocationHandler (RetryInvocationHandler.java:invoke(140)) - Exception while invoking getFileInfo of class ClientNamenodeProtocolTranslatorPB over node2.proto.bsi.de/192.168.100.22:8020 after 1 fail over attempts. Trying to fail over immediately.
java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "node2.proto.bsi.de/192.168.100.22"; destination host is: "node2.proto.bsi.de":8020; 
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:772)
        at org.apache.hadoop.ipc.Client.call(Client.java:1472)
        at org.apache.hadoop.ipc.Client.call(Client.java:1399)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
        at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
        at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1988)
        at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1118)
        at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1400)
        at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:520)
        at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:478)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:430)
        at org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:123)
        at org.apache.hive.service.cli.CLIService.init(CLIService.java:81)
        at org.apache.hive.service.CompositeService.init(CompositeService.java:59)
        at org.apache.hive.service.server.HiveServer2.init(HiveServer2.java:89)
        at org.apache.hive.service.server.HiveServer2.startHiveServer2(HiveServer2.java:298)
        at org.apache.hive.service.server.HiveServer2.access$400(HiveServer2.java:65)
        at org.apache.hive.service.server.HiveServer2$StartOptionExecutor.execute(HiveServer2.java:508)
        at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:381)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:680)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
{code}

And now the startup log with the patch reverted:
{code}
2015-04-07 16:38:40,862 INFO  [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(292)) - Starting HiveServer2
2015-04-07 16:38:41,384 INFO  [main]: security.UserGroupInformation (UserGroupInformation.java:loginUserFromKeytab(938)) - Login successful for user hive/node2.proto.bsi.de@PROTO.BSI.DE using keytab file /etc/hive.keytab
2015-04-07 16:38:41,384 INFO  [main]: cli.CLIService (CLIService.java:init(100)) - SPNego httpUGI not created, spNegoPrincipal: , ketabFile: 
2015-04-07 16:38:42,044 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:newRawStore(556)) - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2015-04-07 16:38:42,074 INFO  [main]: metastore.ObjectStore (ObjectStore.java:initialize(264)) - ObjectStore, initialize called
2015-04-07 16:38:44,682 INFO  [main]: metastore.ObjectStore (ObjectStore.java:getPMF(345)) - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2015-04-07 16:38:46,762 INFO  [main]: metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(132)) - Using direct SQL, underlying DB is DERBY
2015-04-07 16:38:46,762 INFO  [main]: metastore.ObjectStore (ObjectStore.java:setConf(247)) - Initialized ObjectStore
2015-04-07 16:38:47,067 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:createDefaultRoles_core(630)) - Added admin role in metastore
2015-04-07 16:38:47,068 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:createDefaultRoles_core(639)) - Added public role in metastore
2015-04-07 16:38:48,101 INFO  [main]: session.SessionState (SessionState.java:createPath(558)) - Created local directory: /tmp/7a90155d-fbd9-4991-bce6-f583a9ef259f_resources
2015-04-07 16:38:48,111 INFO  [main]: session.SessionState (SessionState.java:createPath(558)) - Created HDFS directory: /tmp/hive/hive/7a90155d-fbd9-4991-bce6-f583a9ef259f
2015-04-07 16:38:48,113 INFO  [main]: session.SessionState (SessionState.java:createPath(558)) - Created local directory: /tmp/hive/7a90155d-fbd9-4991-bce6-f583a9ef259f
2015-04-07 16:38:48,118 INFO  [main]: session.SessionState (SessionState.java:createPath(558)) - Created HDFS directory: /tmp/hive/hive/7a90155d-fbd9-4991-bce6-f583a9ef259f/_tmp_space.db
2015-04-07 16:38:48,120 INFO  [main]: session.SessionState (SessionState.java:start(460)) - No Tez session required at this point. hive.execution.engine=mr.
2015-04-07 16:38:48,134 INFO  [main]: sqlstd.SQLStdHiveAccessController (SQLStdHiveAccessController.java:<init>(95)) - Created SQLStdHiveAccessController for session context : HiveAuthzSessionContext [sessionString=7a90155d-fbd9-4991-bce6-f583a9ef259f, clientType=HIVESERVER2]
2015-04-07 16:38:48,148 INFO  [main]: service.CompositeService (SessionManager.java:initOperationLogRootDir(148)) - Operation log root directory is created: /tmp/hive/operation_logs
2015-04-07 16:38:48,153 INFO  [main]: service.CompositeService (SessionManager.java:createBackgroundOperationPool(96)) - HiveServer2: Background operation thread pool size: 100
2015-04-07 16:38:48,153 INFO  [main]: service.CompositeService (SessionManager.java:createBackgroundOperationPool(98)) - HiveServer2: Background operation thread wait queue size: 100
2015-04-07 16:38:48,153 INFO  [main]: service.CompositeService (SessionManager.java:createBackgroundOperationPool(101)) - HiveServer2: Background operation thread keepalive time: 10 seconds
2015-04-07 16:38:48,157 INFO  [main]: service.AbstractService (AbstractService.java:init(89)) - Service:OperationManager is inited.
2015-04-07 16:38:48,157 INFO  [main]: service.AbstractService (AbstractService.java:init(89)) - Service:SessionManager is inited.
2015-04-07 16:38:48,157 INFO  [main]: service.AbstractService (AbstractService.java:init(89)) - Service:CLIService is inited.
2015-04-07 16:38:48,158 INFO  [main]: service.AbstractService (AbstractService.java:init(89)) - Service:ThriftBinaryCLIService is inited.
2015-04-07 16:38:48,158 INFO  [main]: service.AbstractService (AbstractService.java:init(89)) - Service:HiveServer2 is inited.
2015-04-07 16:38:48,158 INFO  [main]: service.AbstractService (AbstractService.java:start(104)) - Service:OperationManager is started.
2015-04-07 16:38:48,158 INFO  [main]: service.AbstractService (AbstractService.java:start(104)) - Service:SessionManager is started.
2015-04-07 16:38:48,158 INFO  [main]: service.AbstractService (AbstractService.java:start(104)) - Service:CLIService is started.
2015-04-07 16:38:48,161 INFO  [main]: metastore.ObjectStore (ObjectStore.java:initialize(264)) - ObjectStore, initialize called
2015-04-07 16:38:48,167 INFO  [main]: metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(132)) - Using direct SQL, underlying DB is DERBY
2015-04-07 16:38:48,167 INFO  [main]: metastore.ObjectStore (ObjectStore.java:setConf(247)) - Initialized ObjectStore
2015-04-07 16:38:48,168 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(713)) - 0: get_databases: default
2015-04-07 16:38:48,168 INFO  [main]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(339)) - ugi=hive/node2.proto.bsi.de@PROTO.BSI.DE    ip=unknown-ip-addr      cmd=get_databases: default      
2015-04-07 16:38:48,193 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(713)) - 0: Shutting down the object store...
2015-04-07 16:38:48,193 INFO  [main]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(339)) - ugi=hive/node2.proto.bsi.de@PROTO.BSI.DE    ip=unknown-ip-addr      cmd=Shutting down the object store...   
2015-04-07 16:38:48,193 INFO  [main]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(713)) - 0: Metastore shutdown complete.
2015-04-07 16:38:48,194 INFO  [main]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(339)) - ugi=hive/node2.proto.bsi.de@PROTO.BSI.DE    ip=unknown-ip-addr      cmd=Metastore shutdown complete.
2015-04-07 16:38:48,194 INFO  [main]: service.AbstractService (AbstractService.java:start(104)) - Service:ThriftBinaryCLIService is started.
2015-04-07 16:38:48,194 INFO  [main]: service.AbstractService (AbstractService.java:start(104)) - Service:HiveServer2 is started.
{code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)