Posted to mapreduce-user@hadoop.apache.org by Elazar Leibovich <el...@gmail.com> on 2014/01/15 21:57:56 UTC

"Server not found in Kerberos database" for MiniKDC server

Hi,

For educational purposes, I'm trying to set up a minimal working secure Hadoop
cluster on my machine.

What I basically did is:

Add example.com to /etc/hosts
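
The entry maps example.com to the loopback address, so the line is roughly:

127.0.0.1    example.com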

Set up a MiniKDC server. It generates krb5.conf and a keytab, along with some
users - {nn,dn,hdfs}@EXAMPLE.COM
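
Roughly, the MiniKDC setup looks like this (a trimmed sketch using
org.apache.hadoop.minikdc.MiniKdc from the hadoop-minikdc module, not the
exact code I ran; workDir is the hadoop_rpc_walktrhough directory that shows
up in the paths below):

Properties kdcConf = MiniKdc.createConf();     // EXAMPLE.COM is the default realm
MiniKdc kdc = new MiniKdc(kdcConf, workDir);   // workDir is a java.io.File
kdc.start();                                   // writes krb5.conf into workDir

// One keytab holding all the principals used by the cluster and the client.
File keytab = new File(workDir, "keytab");
kdc.createPrincipal(keytab, "nn/EXAMPLE.COM", "dn/EXAMPLE.COM", "hdfs/EXAMPLE.COM");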

Point Java at krb5.conf with HADOOP_OPTS, along with a workaround required for
the Mac OS X JVM:

❯ ~/hadoopconf env HADOOP_OPTS
hadoop-env.sh HADOOP_OPTS = -Djava.awt.headless=true
-Djava.security.krb5.conf=/Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf
-Djava.net.preferIPv4Stack=true
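
In hadoop-env.sh this is just an export, along the lines of:

export HADOOP_OPTS="-Djava.awt.headless=true -Djava.security.krb5.conf=/Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf -Djava.net.preferIPv4Stack=true"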


Set the proper Hadoop configuration using the keytab and the Hadoop users:

❯ ~/hadoopconf get --local
hdfs-site.xml dfs.datanode.address            = example.com:1004
core-site.xml fs.defaultFS                    = hdfs://example.com
hdfs-site.xml dfs.namenode.keytab.file        =
/Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/keytab
hdfs-site.xml dfs.datanode.hostname           = example.com
hdfs-site.xml dfs.datanode.kerberos.principal = dn/EXAMPLE.COM@EXAMPLE.COM
hdfs-site.xml dfs.datanode.data.dir           =
/tmp/hadoop-eleibovi/dfs/data
hdfs-site.xml dfs.datanode.keytab.file        =
/Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/keytab
hdfs-site.xml dfs.namenode.kerberos.principal = nn/EXAMPLE.COM@EXAMPLE.COM
core-site.xml hadoop.security.authorization   = true
core-site.xml hadoop.security.authentication  = kerberos
hdfs-site.xml dfs.datanode.dns.interface      = lo0
hdfs-site.xml dfs.datanode.http.address       = example.com:1006
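
In XML form these are ordinary <property> entries, for example in
core-site.xml and hdfs-site.xml respectively:

<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>

<property>
  <name>dfs.namenode.kerberos.principal</name>
  <value>nn/EXAMPLE.COM@EXAMPLE.COM</value>
</property>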

Start the namenode service.
$ ./bin/hdfs
...
14/01/15 19:22:43 INFO ipc.Server: IPC Server listener on 8020: starting
14/01/15 19:22:43 INFO namenode.NameNode: NameNode RPC up at: localhost/
127.0.0.1:8020
14/01/15 19:22:43 INFO namenode.FSNamesystem: Starting services required
for active state

Finally, use the following short Java program to contact the namenode:

// Log in from the MiniKDC-generated keytab, then issue a NameNode RPC as that user.
System.setProperty("java.security.krb5.conf", cwd + "/krb5.conf");
UserGroupInformation.setConfiguration(conf);
UserGroupInformation ugi = UserGroupInformation
        .loginUserFromKeytabAndReturnUGI("hdfs/EXAMPLE.COM", cwd + "/keytab");
ugi.doAs(new PrivilegedExceptionAction<Object>() {
    @Override
    public Object run() throws Exception {
        final FileSystem fs = FileSystem.get(conf);
        fs.getFileStatus(new Path("/"));   // triggers the Kerberos-authenticated RPC
        return null;
    }
});
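
The snippet runs inside a small ToolRunner driver (a sketch of the surrounding
class, abbreviated; only the class name matches the HadoopBasicUsage class in
the stack traces below):

public class HadoopBasicUsage extends Configured implements Tool {
    public static void main(String[] args) throws Exception {
        // ToolRunner parses the -conf arguments (see the "args:" line in the log)
        // and loads core-site.xml and hdfs-site.xml into the Configuration.
        ToolRunner.run(new HadoopBasicUsage(), args);
    }

    @Override
    public int run(String[] args) throws Exception {
        final Configuration conf = getConf();
        // Assumption: how cwd is actually derived isn't shown above.
        final String cwd = System.getProperty("user.dir");
        // ... the login + doAs code above goes here ...
        return 0;
    }
}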

The exception I got is:

Exception in thread "main" java.io.IOException: Failed on local exception:
java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed
[Caused by GSSException: No valid credentials provided (Mechanism level:
Server not found in Kerberos database (7) - Server not found in Kerberos
database)]; Host Details : local host is: "tlv-mpbxb/127.0.0.1";
destination host is: "example.com":8020;

I'd be glad of any help with debugging the problem.

Thanks,

I attach a full log with Kerberos debug turned on:

args: [-conf,
/Users/eleibovi/dev/securehadoop/hadoop-2.1.0-beta/etc/hadoop/core-site.xml,
-conf,
/Users/eleibovi/dev/securehadoop/hadoop-2.1.0-beta/etc/hadoop/hdfs-site.xml]
2014-01-15 19:29:46 DEBUG MutableMetricsFactory:42 - field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess
with annotation
@org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=,
value=[Rate of successful kerberos logins and latency (milliseconds)],
always=false, type=DEFAULT, sampleName=Ops)
2014-01-15 19:29:46 DEBUG MutableMetricsFactory:42 - field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure
with annotation
@org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=,
value=[Rate of failed kerberos logins and latency (milliseconds)],
always=false, type=DEFAULT, sampleName=Ops)
2014-01-15 19:29:46 DEBUG MetricsSystemImpl:220 - UgiMetrics, User and
group related metrics
Java config name:
/Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf
Loaded from Java config
2014-01-15 19:29:46 DEBUG Groups:180 -  Creating new Groups object
2014-01-15 19:29:46 DEBUG NativeCodeLoader:46 - Trying to load the
custom-built native-hadoop library...
2014-01-15 19:29:46 DEBUG NativeCodeLoader:55 - Failed to load
native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in
java.library.path
2014-01-15 19:29:46 DEBUG NativeCodeLoader:56 -
java.library.path=/Users/eleibovi/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
2014-01-15 19:29:46 WARN  NativeCodeLoader:62 - Unable to load
native-hadoop library for your platform... using builtin-java classes where
applicable
2014-01-15 19:29:46 DEBUG JniBasedUnixGroupsMappingWithFallback:40 -
Falling back to shell based
2014-01-15 19:29:46 DEBUG JniBasedUnixGroupsMappingWithFallback:44 - Group
mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
2014-01-15 19:29:46 DEBUG Groups:66 - Group mapping
impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback;
cacheTimeout=300000
Java config name:
/Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf
Loaded from Java config
>>> KdcAccessibility: reset
>>> KdcAccessibility: reset
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): nn
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTab: load() entry length: 53; type: 3
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): nn
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTab: load() entry length: 69; type: 16
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): nn
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTab: load() entry length: 61; type: 17
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): nn
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTab: load() entry length: 61; type: 23
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): dn
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTab: load() entry length: 53; type: 3
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): dn
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTab: load() entry length: 69; type: 16
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): dn
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTab: load() entry length: 61; type: 17
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): dn
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTab: load() entry length: 61; type: 23
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): eleibovi
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTab: load() entry length: 59; type: 3
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): eleibovi
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTab: load() entry length: 75; type: 16
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): eleibovi
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTab: load() entry length: 67; type: 17
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): eleibovi
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTab: load() entry length: 67; type: 23
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): hdfs
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTab: load() entry length: 55; type: 3
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): hdfs
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTab: load() entry length: 71; type: 16
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): hdfs
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTab: load() entry length: 63; type: 17
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): hdfs
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTab: load() entry length: 63; type: 23
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): hdfs
>>> KeyTab: load() entry length: 42; type: 3
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): hdfs
>>> KeyTab: load() entry length: 58; type: 16
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): hdfs
>>> KeyTab: load() entry length: 50; type: 17
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): hdfs
>>> KeyTab: load() entry length: 50; type: 23
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): root
>>> KeyTab: load() entry length: 42; type: 3
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): root
>>> KeyTab: load() entry length: 58; type: 16
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): root
>>> KeyTab: load() entry length: 50; type: 17
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): root
>>> KeyTab: load() entry length: 50; type: 23
Added key: 23version: 0
Added key: 17version: 0
Added key: 16version: 0
Added key: 3version: 0
Ordering keys wrt default_tkt_enctypes list
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
Added key: 23version: 0
Added key: 17version: 0
Added key: 16version: 0
Added key: 3version: 0
Ordering keys wrt default_tkt_enctypes list
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:50064, timeout=30000, number of
retries =3, #bytes=158
>>> KDCCommunication: kdc=localhost TCP:50064, timeout=30000,Attempt =1,
#bytes=158
>>>DEBUG: TCPClient reading 529 bytes
>>> KrbKdcReq send: #bytes read=529
>>> KdcAccessibility: remove localhost:50064
Added key: 23version: 0
Added key: 17version: 0
Added key: 16version: 0
Added key: 3version: 0
Ordering keys wrt default_tkt_enctypes list
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsRep cons in KrbAsReq.getReply hdfs/EXAMPLE.COM
2014-01-15 19:29:47 DEBUG UserGroupInformation:176 - hadoop login
Added key: 23version: 0
Added key: 17version: 0
Added key: 16version: 0
Added key: 3version: 0
Ordering keys wrt default_tkt_enctypes list
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
2014-01-15 19:29:47 DEBUG UserGroupInformation:125 - hadoop login commit
2014-01-15 19:29:47 DEBUG UserGroupInformation:139 - using kerberos
user:hdfs/EXAMPLE.COM@EXAMPLE.COM
2014-01-15 19:29:47 DEBUG UserGroupInformation:1499 - PrivilegedAction
as:hdfs/EXAMPLE.COM@EXAMPLE.COM (auth:KERBEROS)
from:com.github.elazar.hadoop.examples.HadoopBasicUsage.run(HadoopBasicUsage.java:34)
>>KERBEROS
2014-01-15 19:29:47 DEBUG BlockReaderLocal:326 -
dfs.client.use.legacy.blockreader.local = false
2014-01-15 19:29:47 DEBUG BlockReaderLocal:329 -
dfs.client.read.shortcircuit = false
2014-01-15 19:29:47 DEBUG BlockReaderLocal:332 -
dfs.client.domain.socket.data.traffic = false
2014-01-15 19:29:47 DEBUG BlockReaderLocal:335 - dfs.domain.socket.path =
2014-01-15 19:29:47 DEBUG MetricsSystemImpl:220 - StartupProgress, NameNode
startup progress
2014-01-15 19:29:47 DEBUG RetryUtils:74 - multipleLinearRandomRetry = null
2014-01-15 19:29:47 DEBUG Server:220 - rpcKind=RPC_PROTOCOL_BUFFER,
rpcRequestWrapperClass=class
org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper,
rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@379b0d9c
2014-01-15 19:29:47 DEBUG BlockReaderLocal:63 - Both short-circuit local
reads and UNIX domain socket are disabled.
2014-01-15 19:29:47 DEBUG Shell:237 - Failed to detect a valid hadoop home
directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:219)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:244)
 at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
at
org.apache.hadoop.conf.Configuration.getTrimmedStrings(Configuration.java:1539)
 at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:492)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:445)
 at
org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:136)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2429)
 at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2463)
 at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2445)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:363)
 at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:165)
at
com.github.elazar.hadoop.examples.HadoopBasicUsage$1.run(HadoopBasicUsage.java:38)
 at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
 at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
at
com.github.elazar.hadoop.examples.HadoopBasicUsage.run(HadoopBasicUsage.java:34)
 at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
 at
com.github.elazar.hadoop.examples.HadoopBasicUsage.main(HadoopBasicUsage.java:18)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
2014-01-15 19:29:47 DEBUG Shell:316 - setsid is not available on this
machine. So not using it.
2014-01-15 19:29:47 DEBUG Shell:320 - setsid exited with exit code 0
2014-01-15 19:29:47 DEBUG Client:371 - The ping interval is 60000 ms.
2014-01-15 19:29:47 DEBUG Client:636 - Connecting to
example.com/127.0.0.1:8020
2014-01-15 19:29:47 DEBUG UserGroupInformation:1499 - PrivilegedAction
as:hdfs/EXAMPLE.COM@EXAMPLE.COM (auth:KERBEROS)
from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:654)
2014-01-15 19:29:47 DEBUG SaslRpcClient:438 - Sending sasl message state:
NEGOTIATE

2014-01-15 19:29:47 DEBUG SaslRpcClient:370 - Received SASL message state:
NEGOTIATE
auths {
  method: "TOKEN"
  mechanism: "DIGEST-MD5"
  protocol: ""
  serverId: "default"
  challenge:
"realm=\"default\",nonce=\"Evyt9cWyZFDUbCtQYNcgF5FY7rsBqxVCgtggY48n\",qop=\"auth\",charset=utf-8,algorithm=md5-sess"
}
auths {
  method: "KERBEROS"
  mechanism: "GSSAPI"
  protocol: "nn"
  serverId: "EXAMPLE.COM"
}

2014-01-15 19:29:47 DEBUG SaslRpcClient:259 - Get token info
proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB
info:@org.apache.hadoop.security.token.TokenInfo(value=class
org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector)
2014-01-15 19:29:47 DEBUG SaslRpcClient:287 - Get kerberos info
proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB
info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=,
serverPrincipal=dfs.namenode.kerberos.principal)
2014-01-15 19:29:47 DEBUG SaslRpcClient:231 - RPC Server's Kerberos
principal name for
protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is nn/
EXAMPLE.COM@EXAMPLE.COM
2014-01-15 19:29:47 DEBUG SaslRpcClient:242 - Creating SASL
GSSAPI(KERBEROS)  client to authenticate to service at EXAMPLE.COM
2014-01-15 19:29:47 DEBUG SaslRpcClient:172 - Use KERBEROS authentication
for protocol ClientNamenodeProtocolPB
Found ticket for hdfs/EXAMPLE.COM@EXAMPLE.COM to go to krbtgt/
EXAMPLE.COM@EXAMPLE.COM expiring on Thu Jan 16 19:29:46 IST 2014
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for hdfs/EXAMPLE.COM@EXAMPLE.COM to go to krbtgt/
EXAMPLE.COM@EXAMPLE.COM expiring on Thu Jan 16 19:29:46 IST 2014
Service ticket not found in the subject
>>> Credentials acquireServiceCreds: same realm
Using builtin default etypes for default_tgs_enctypes
default etypes for default_tgs_enctypes: 18 17 16 23 1 3.
>>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbKdcReq send: kdc=localhost TCP:50064, timeout=30000, number of
retries =3, #bytes=583
>>> KDCCommunication: kdc=localhost TCP:50064, timeout=30000,Attempt =1,
#bytes=583
>>>DEBUG: TCPClient reading 135 bytes
>>> KrbKdcReq send: #bytes read=135
>>> KdcAccessibility: remove localhost:50064
>>> KDCRep: init() encoding tag is 126 req type is 13
>>>KRBError:
 sTime is Wed Jan 15 19:29:47 IST 2014 1389806987000
 suSec is 0
 error code is 7
 error Message is Server not found in Kerberos database
 realm is EXAMPLE.COM
 sname is krbtgt/EXAMPLE.COM
 msgType is 30
2014-01-15 19:29:47 ERROR UserGroupInformation:1480 -
PriviledgedActionException as:hdfs/EXAMPLE.COM@EXAMPLE.COM (auth:KERBEROS)
cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by
GSSException: No valid credentials provided (Mechanism level: Server not
found in Kerberos database (7) - Server not found in Kerberos database)]
2014-01-15 19:29:47 DEBUG UserGroupInformation:1499 - PrivilegedAction
as:hdfs/EXAMPLE.COM@EXAMPLE.COM (auth:KERBEROS)
from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:583)
>>>KinitOptions cache name is /Users/eleibovi/krb5cc_eleibovi
>> Acquire default native Credentials
>>> Obtained TGT from LSA: Credentials:
*...Here it looks like it tries to get the machine's Kerberos tickets...*
client=eleibovi@[snipped computer Kerberos server from /etc/krb5.conf]
server=krbtgt/[snipped computer Kerberos server from /etc/krb5.conf]
authTime=20140109035156Z
startTime=20140109115308Z
endTime=20140109215308Z
renewTill=20140116035156Z
flags: FORWARDABLE;RENEWABLE;INITIAL;PRE-AUTHENT
EType (int): 18
Using builtin default etypes for default_tgs_enctypes
default etypes for default_tgs_enctypes: 18 17 16 23 1 3.
>>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>> EType: sun.security.krb5.internal.crypto.Aes256CtsHmacSha1EType
getKDCFromDNS using UDP
getKDCFromDNS using TCP
2014-01-15 19:30:18 DEBUG UserGroupInformation:176 - hadoop login
2014-01-15 19:30:18 DEBUG UserGroupInformation:125 - hadoop login commit
2014-01-15 19:30:18 DEBUG UserGroupInformation:139 - using kerberos
user:null
2014-01-15 19:30:18 DEBUG UserGroupInformation:155 - using local
user:UnixPrincipal: eleibovi
2014-01-15 19:30:18 DEBUG UserGroupInformation:696 - UGI loginUser:eleibovi
(auth:KERBEROS)
2014-01-15 19:30:18 WARN  Client:615 - Exception encountered while
connecting to the server : javax.security.sasl.SaslException: GSS initiate
failed [Caused by GSSException: No valid credentials provided (Mechanism
level: Server not found in Kerberos database (7) - Server not found in
Kerberos database)]
2014-01-15 19:30:18 ERROR UserGroupInformation:1480 -
PriviledgedActionException as:hdfs/EXAMPLE.COM@EXAMPLE.COM (auth:KERBEROS)
cause:java.io.IOException: javax.security.sasl.SaslException: GSS initiate
failed [Caused by GSSException: No valid credentials provided (Mechanism
level: Server not found in Kerberos database (7) - Server not found in
Kerberos database)]
2014-01-15 19:30:18 DEBUG Client:1099 - closing ipc connection to
example.com/127.0.0.1:8020: javax.security.sasl.SaslException: GSS initiate
failed [Caused by GSSException: No valid credentials provided (Mechanism
level: Server not found in Kerberos database (7) - Server not found in
Kerberos database)]
java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed
[Caused by GSSException: No valid credentials provided (Mechanism level:
Server not found in Kerberos database (7) - Server not found in Kerberos
database)]
at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:620)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
at
org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:583)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:667)
at org.apache.hadoop.ipc.Client$Connection.access$2600(Client.java:314)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1399)
at org.apache.hadoop.ipc.Client.call(Client.java:1318)
at org.apache.hadoop.ipc.Client.call(Client.java:1300)
at
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:188)
at
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
at
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:651)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1636)
at
org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1117)
at
org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1113)
at
org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:78)
at
org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1113)
at
com.github.elazar.hadoop.examples.HadoopBasicUsage$1.run(HadoopBasicUsage.java:40)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
at
com.github.elazar.hadoop.examples.HadoopBasicUsage.run(HadoopBasicUsage.java:34)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at
com.github.elazar.hadoop.examples.HadoopBasicUsage.main(HadoopBasicUsage.java:18)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused
by GSSException: No valid credentials provided (Mechanism level: Server not
found in Kerberos database (7) - Server not found in Kerberos database)]
at
com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
at
org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:394)
at
org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:494)
at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:314)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:659)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:655)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:654)
... 32 more
Caused by: GSSException: No valid credentials provided (Mechanism level:
Server not found in Kerberos database (7) - Server not found in Kerberos
database)
at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:710)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at
com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
... 41 more
Caused by: KrbException: Server not found in Kerberos database (7) - Server
not found in Kerberos database
at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:73)
at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:192)
at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:203)
at
sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:311)
at
sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:115)
at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:449)
at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:641)
... 44 more
Caused by: KrbException: Identifier doesn't match expected value (906)
at sun.security.krb5.internal.KDCRep.init(KDCRep.java:143)
at sun.security.krb5.internal.TGSRep.init(TGSRep.java:66)
at sun.security.krb5.internal.TGSRep.<init>(TGSRep.java:61)
at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:55)
... 50 more
2014-01-15 19:30:18 DEBUG Client:1107 - IPC Client (23781497) connection to
example.com/127.0.0.1:8020 from hdfs/EXAMPLE.COM@EXAMPLE.COM: closed

Re: "Server not found in Kerberos database" for MiniKDC server

Posted by Elazar Leibovich <el...@gmail.com>.
For the sake of completeness.

The same settings worked on a Linux box.



Re: "Server not found in Kerberos database" for MiniKDC server

Posted by Elazar Leibovich <el...@gmail.com>.
For the sake of completion.

The same settings worked in a Linux box.


On Wed, Jan 15, 2014 at 10:57 PM, Elazar Leibovich <el...@gmail.com>wrote:

> Hi,
>
> For educational purposes, I'm trying to set a minimal working secure
> Hadoop cluster on my machine.
>
> What I basically did is:
>
> Add example.com to /etc/hosts
>
> Set a minkdc server. It'll generate krb5.conf and keytab. Generates some
> users - {nn,dn,hdfs}@EXAMPLE.COM
>
> Refer Java to krb5.conf with HADOOP_OPTS, as well as a required workaround
> for Mac OS X java:
>
> ❯ ~/hadoopconf env HADOOP_OPTS
> hadoop-env.sh HADOOP_OPTS = -Djava.awt.headless=true
> -Djava.security.krb5.conf=/Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf
> -Djava.net.preferIPv4Stack=true
>
>
> Set the proper hadoop configuration using the keytab and the hadoop users:
>
> ❯ ~/hadoopconf get --local
> hdfs-site.xml dfs.datanode.address            = example.com:1004
> core-site.xml fs.defaultFS                    = hdfs://example.com
> hdfs-site.xml dfs.namenode.keytab.file        =
> /Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/keytab
> hdfs-site.xml dfs.datanode.hostname           = example.com
> hdfs-site.xml dfs.datanode.kerberos.principal = dn/EXAMPLE.COM@EXAMPLE.COM
> hdfs-site.xml dfs.datanode.data.dir           =
> /tmp/hadoop-eleibovi/dfs/data
> hdfs-site.xml dfs.datanode.keytab.file        =
> /Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/keytab
> hdfs-site.xml dfs.namenode.kerberos.principal = nn/EXAMPLE.COM@EXAMPLE.COM
> core-site.xml hadoop.security.authorization   = true
> core-site.xml hadoop.security.authentication  = kerberos
> hdfs-site.xml dfs.datanode.dns.interface      = lo0
> hdfs-site.xml dfs.datanode.http.address       = example.com:1006
>
> Start the namenode service.
> $ ./bin/hdfs
> ...
> 14/01/15 19:22:43 INFO ipc.Server: IPC Server listener on 8020: starting
> 14/01/15 19:22:43 INFO namenode.NameNode: NameNode RPC up at: localhost/
> 127.0.0.1:8020
> 14/01/15 19:22:43 INFO namenode.FSNamesystem: Starting services required
> for active state
>
> Finally use the following short Java program to contact the namenode:
>
> System.setProperty("java.security.krb5.conf", cwd + "/krb5.conf");
> UserGroupInformation.setConfiguration(conf);
>         UserGroupInformation ugi = UserGroupInformation.
>                 loginUserFromKeytabAndReturnUGI("hdfs/EXAMPLE.COM", cwd +
> "/keytab");
>  ugi.doAs(new PrivilegedExceptionAction<Object>() {
>             @Override
>             public Object run() throws Exception {
>                 final FileSystem fs = FileSystem.get(conf);
>                 fs.getFileStatus(new Path("/"));
>          }
> }
>
> The exception I got is:
>
> Exception in thread "main" java.io.IOException: Failed on local exception:
> java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed
> [Caused by GSSException: No valid credentials provided (Mechanism level:
> Server not found in Kerberos database (7) - Server not found in Kerberos
> database)]; Host Details : local host is: "tlv-mpbxb/127.0.0.1";
> destination host is: "example.com":8020;
>
> I'll be glad to any help with debugging the problem.
>
> Thanks,
>
> I attach a full log with Kerberos debug turned on:
>
> args: [-conf,
> /Users/eleibovi/dev/securehadoop/hadoop-2.1.0-beta/etc/hadoop/core-site.xml,
> -conf,
> /Users/eleibovi/dev/securehadoop/hadoop-2.1.0-beta/etc/hadoop/hdfs-site.xml]
> 2014-01-15 19:29:46 DEBUG MutableMetricsFactory:42 - field
> org.apache.hadoop.metrics2.lib.MutableRate
> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess
> with annotation
> @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=,
> value=[Rate of successful kerberos logins and latency (milliseconds)],
> always=false, type=DEFAULT, sampleName=Ops)
> 2014-01-15 19:29:46 DEBUG MutableMetricsFactory:42 - field
> org.apache.hadoop.metrics2.lib.MutableRate
> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure
> with annotation
> @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=,
> value=[Rate of failed kerberos logins and latency (milliseconds)],
> always=false, type=DEFAULT, sampleName=Ops)
> 2014-01-15 19:29:46 DEBUG MetricsSystemImpl:220 - UgiMetrics, User and
> group related metrics
> Java config name:
> /Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf
> Loaded from Java config
> 2014-01-15 19:29:46 DEBUG Groups:180 -  Creating new Groups object
> 2014-01-15 19:29:46 DEBUG NativeCodeLoader:46 - Trying to load the
> custom-built native-hadoop library...
> 2014-01-15 19:29:46 DEBUG NativeCodeLoader:55 - Failed to load
> native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in
> java.library.path
> 2014-01-15 19:29:46 DEBUG NativeCodeLoader:56 -
> java.library.path=/Users/eleibovi/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
> 2014-01-15 19:29:46 WARN  NativeCodeLoader:62 - Unable to load
> native-hadoop library for your platform... using builtin-java classes where
> applicable
> 2014-01-15 19:29:46 DEBUG JniBasedUnixGroupsMappingWithFallback:40 -
> Falling back to shell based
> 2014-01-15 19:29:46 DEBUG JniBasedUnixGroupsMappingWithFallback:44 - Group
> mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
> 2014-01-15 19:29:46 DEBUG Groups:66 - Group mapping
> impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback;
> cacheTimeout=300000
> Java config name:
> /Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf
> Loaded from Java config
> >>> KdcAccessibility: reset
> >>> KdcAccessibility: reset
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): nn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 53; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): nn
>  >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 69; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): nn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 61; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): nn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 61; type: 23
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): dn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 53; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): dn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 69; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): dn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 61; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): dn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 61; type: 23
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): eleibovi
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 59; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): eleibovi
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 75; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): eleibovi
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 67; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): eleibovi
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 67; type: 23
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 55; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 71; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 63; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 63; type: 23
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTab: load() entry length: 42; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTab: load() entry length: 58; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTab: load() entry length: 50; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTab: load() entry length: 50; type: 23
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): root
> >>> KeyTab: load() entry length: 42; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): root
> >>> KeyTab: load() entry length: 58; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): root
> >>> KeyTab: load() entry length: 50; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): root
> >>> KeyTab: load() entry length: 50; type: 23
> Added key: 23version: 0
> Added key: 17version: 0
> Added key: 16version: 0
> Added key: 3version: 0
> Ordering keys wrt default_tkt_enctypes list
> Using builtin default etypes for default_tkt_enctypes
> default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
> Added key: 23version: 0
> Added key: 17version: 0
> Added key: 16version: 0
> Added key: 3version: 0
> Ordering keys wrt default_tkt_enctypes list
> Using builtin default etypes for default_tkt_enctypes
> default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
> Using builtin default etypes for default_tkt_enctypes
> default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
> >>> KrbAsReq creating message
> >>> KrbKdcReq send: kdc=localhost TCP:50064, timeout=30000, number of
> retries =3, #bytes=158
> >>> KDCCommunication: kdc=localhost TCP:50064, timeout=30000,Attempt =1,
> #bytes=158
> >>>DEBUG: TCPClient reading 529 bytes
> >>> KrbKdcReq send: #bytes read=529
> >>> KdcAccessibility: remove localhost:50064
> Added key: 23version: 0
> Added key: 17version: 0
> Added key: 16version: 0
> Added key: 3version: 0
> Ordering keys wrt default_tkt_enctypes list
> Using builtin default etypes for default_tkt_enctypes
> default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
> >>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
> >>> KrbAsRep cons in KrbAsReq.getReply hdfs/EXAMPLE.COM
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:176 - hadoop login
> Added key: 23version: 0
> Added key: 17version: 0
> Added key: 16version: 0
> Added key: 3version: 0
> Ordering keys wrt default_tkt_enctypes list
> Using builtin default etypes for default_tkt_enctypes
> default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:125 - hadoop login commit
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:139 - using kerberos
> user:hdfs/EXAMPLE.COM@EXAMPLE.COM
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:1499 - PrivilegedAction
> as:hdfs/EXAMPLE.COM@EXAMPLE.COM (auth:KERBEROS)
> from:com.github.elazar.hadoop.examples.HadoopBasicUsage.run(HadoopBasicUsage.java:34)
> >>KERBEROS
> 2014-01-15 19:29:47 DEBUG BlockReaderLocal:326 -
> dfs.client.use.legacy.blockreader.local = false
> 2014-01-15 19:29:47 DEBUG BlockReaderLocal:329 -
> dfs.client.read.shortcircuit = false
> 2014-01-15 19:29:47 DEBUG BlockReaderLocal:332 -
> dfs.client.domain.socket.data.traffic = false
> 2014-01-15 19:29:47 DEBUG BlockReaderLocal:335 - dfs.domain.socket.path =
> 2014-01-15 19:29:47 DEBUG MetricsSystemImpl:220 - StartupProgress,
> NameNode startup progress
> 2014-01-15 19:29:47 DEBUG RetryUtils:74 - multipleLinearRandomRetry = null
> 2014-01-15 19:29:47 DEBUG Server:220 - rpcKind=RPC_PROTOCOL_BUFFER,
> rpcRequestWrapperClass=class
> org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper,
> rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@379b0d9c
> 2014-01-15 19:29:47 DEBUG BlockReaderLocal:63 - Both short-circuit local
> reads and UNIX domain socket are disabled.
> 2014-01-15 19:29:47 DEBUG Shell:237 - Failed to detect a valid hadoop home
> directory
> java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
> at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:219)
> at org.apache.hadoop.util.Shell.<clinit>(Shell.java:244)
>  at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
> at
> org.apache.hadoop.conf.Configuration.getTrimmedStrings(Configuration.java:1539)
>  at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:492)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:445)
>  at
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:136)
> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2429)
>  at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
> at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2463)
>  at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2445)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:363)
>  at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:165)
> at
> com.github.elazar.hadoop.examples.HadoopBasicUsage$1.run(HadoopBasicUsage.java:38)
>  at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
>  at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
> at
> com.github.elazar.hadoop.examples.HadoopBasicUsage.run(HadoopBasicUsage.java:34)
>  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>  at
> com.github.elazar.hadoop.examples.HadoopBasicUsage.main(HadoopBasicUsage.java:18)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:606)
> at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
> 2014-01-15 19:29:47 DEBUG Shell:316 - setsid is not available on this
> machine. So not using it.
> 2014-01-15 19:29:47 DEBUG Shell:320 - setsid exited with exit code 0
> 2014-01-15 19:29:47 DEBUG Client:371 - The ping interval is 60000 ms.
> 2014-01-15 19:29:47 DEBUG Client:636 - Connecting to
> example.com/127.0.0.1:8020
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:1499 - PrivilegedAction
> as:hdfs/EXAMPLE.COM@EXAMPLE.COM (auth:KERBEROS)
> from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:654)
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:438 - Sending sasl message state:
> NEGOTIATE
>
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:370 - Received SASL message state:
> NEGOTIATE
> auths {
>   method: "TOKEN"
>   mechanism: "DIGEST-MD5"
>   protocol: ""
>   serverId: "default"
>   challenge:
> "realm=\"default\",nonce=\"Evyt9cWyZFDUbCtQYNcgF5FY7rsBqxVCgtggY48n\",qop=\"auth\",charset=utf-8,algorithm=md5-sess"
> }
> auths {
>   method: "KERBEROS"
>   mechanism: "GSSAPI"
>   protocol: "nn"
>   serverId: "EXAMPLE.COM"
> }
>
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:259 - Get token info
> proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB
> info:@org.apache.hadoop.security.token.TokenInfo(value=class
> org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector)
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:287 - Get kerberos info
> proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB
> info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=,
> serverPrincipal=dfs.namenode.kerberos.principal)
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:231 - RPC Server's Kerberos
> principal name for
> protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is nn/
> EXAMPLE.COM@EXAMPLE.COM
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:242 - Creating SASL
> GSSAPI(KERBEROS)  client to authenticate to service at EXAMPLE.COM
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:172 - Use KERBEROS authentication
> for protocol ClientNamenodeProtocolPB
> Found ticket for hdfs/EXAMPLE.COM@EXAMPLE.COM to go to krbtgt/
> EXAMPLE.COM@EXAMPLE.COM expiring on Thu Jan 16 19:29:46 IST 2014
> Entered Krb5Context.initSecContext with state=STATE_NEW
> Found ticket for hdfs/EXAMPLE.COM@EXAMPLE.COM to go to krbtgt/
> EXAMPLE.COM@EXAMPLE.COM expiring on Thu Jan 16 19:29:46 IST 2014
> Service ticket not found in the subject
> >>> Credentials acquireServiceCreds: same realm
> Using builtin default etypes for default_tgs_enctypes
> default etypes for default_tgs_enctypes: 18 17 16 23 1 3.
> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
> >>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
> >>> KrbKdcReq send: kdc=localhost TCP:50064, timeout=30000, number of
> retries =3, #bytes=583
> >>> KDCCommunication: kdc=localhost TCP:50064, timeout=30000,Attempt =1,
> #bytes=583
> >>>DEBUG: TCPClient reading 135 bytes
> >>> KrbKdcReq send: #bytes read=135
> >>> KdcAccessibility: remove localhost:50064
> >>> KDCRep: init() encoding tag is 126 req type is 13
> >>>KRBError:
>  sTime is Wed Jan 15 19:29:47 IST 2014 1389806987000
>  suSec is 0
>  error code is 7
>  error Message is Server not found in Kerberos database
>  realm is EXAMPLE.COM
>  sname is krbtgt/EXAMPLE.COM
>  msgType is 30
> 2014-01-15 19:29:47 ERROR UserGroupInformation:1480 -
> PriviledgedActionException as:hdfs/EXAMPLE.COM@EXAMPLE.COM(auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate
> failed [Caused by GSSException: No valid credentials provided (Mechanism
> level: Server not found in Kerberos database (7) - Server not found in
> Kerberos database)]
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:1499 - PrivilegedAction
> as:hdfs/EXAMPLE.COM@EXAMPLE.COM (auth:KERBEROS)
> from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:583)
> >>>KinitOptions cache name is /Users/eleibovi/krb5cc_eleibovi
> >> Acquire default native Credentials
> >>> Obtained TGT from LSA: Credentials:
> *...Here it looks like it tries to get the machine's Kerberos tickets...*
> client=eleibovi@[snipped computer Kerberos server from /etc/krb5.conf]
> server=krbtgt/[snipped computer Kerberos server from /etc/krb5.conf]
> authTime=20140109035156Z
> startTime=20140109115308Z
> endTime=20140109215308Z
> renewTill=20140116035156Z
> flags: FORWARDABLE;RENEWABLE;INITIAL;PRE-AUTHENT
> EType (int): 18
> Using builtin default etypes for default_tgs_enctypes
> default etypes for default_tgs_enctypes: 18 17 16 23 1 3.
> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
> >>> EType: sun.security.krb5.internal.crypto.Aes256CtsHmacSha1EType
> getKDCFromDNS using UDP
> getKDCFromDNS using TCP
> 2014-01-15 19:30:18 DEBUG UserGroupInformation:176 - hadoop login
> 2014-01-15 19:30:18 DEBUG UserGroupInformation:125 - hadoop login commit
> 2014-01-15 19:30:18 DEBUG UserGroupInformation:139 - using kerberos
> user:null
> 2014-01-15 19:30:18 DEBUG UserGroupInformation:155 - using local
> user:UnixPrincipal: eleibovi
> 2014-01-15 19:30:18 DEBUG UserGroupInformation:696 - UGI
> loginUser:eleibovi (auth:KERBEROS)
> 2014-01-15 19:30:18 WARN  Client:615 - Exception encountered while
> connecting to the server : javax.security.sasl.SaslException: GSS initiate
> failed [Caused by GSSException: No valid credentials provided (Mechanism
> level: Server not found in Kerberos database (7) - Server not found in
> Kerberos database)]
> 2014-01-15 19:30:18 ERROR UserGroupInformation:1480 -
> PriviledgedActionException as:hdfs/EXAMPLE.COM@EXAMPLE.COM(auth:KERBEROS) cause:java.io.IOException:
> javax.security.sasl.SaslException: GSS initiate failed [Caused by
> GSSException: No valid credentials provided (Mechanism level: Server not
> found in Kerberos database (7) - Server not found in Kerberos database)]
> 2014-01-15 19:30:18 DEBUG Client:1099 - closing ipc connection to
> example.com/127.0.0.1:8020: javax.security.sasl.SaslException: GSS
> initiate failed [Caused by GSSException: No valid credentials provided
> (Mechanism level: Server not found in Kerberos database (7) - Server not
> found in Kerberos database)]
> java.io.IOException: javax.security.sasl.SaslException: GSS initiate
> failed [Caused by GSSException: No valid credentials provided (Mechanism
> level: Server not found in Kerberos database (7) - Server not found in
> Kerberos database)]
>  at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:620)
> at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:415)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
>  at
> org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:583)
> at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:667)
>  at org.apache.hadoop.ipc.Client$Connection.access$2600(Client.java:314)
> at org.apache.hadoop.ipc.Client.getConnection(Client.java:1399)
>  at org.apache.hadoop.ipc.Client.call(Client.java:1318)
> at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>  at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
> at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>  at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
>  at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:188)
> at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>  at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
> at
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:651)
>  at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1636)
> at
> org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1117)
>  at
> org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1113)
> at
> org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:78)
>  at
> org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1113)
> at
> com.github.elazar.hadoop.examples.HadoopBasicUsage$1.run(HadoopBasicUsage.java:40)
>  at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
>  at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
> at
> com.github.elazar.hadoop.examples.HadoopBasicUsage.run(HadoopBasicUsage.java:34)
>  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>  at
> com.github.elazar.hadoop.examples.HadoopBasicUsage.main(HadoopBasicUsage.java:18)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:606)
> at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
> Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused
> by GSSException: No valid credentials provided (Mechanism level: Server not
> found in Kerberos database (7) - Server not found in Kerberos database)]
>  at
> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
> at
> org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:394)
>  at
> org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:494)
> at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:314)
>  at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:659)
> at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:655)
>  at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
>  at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
> at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:654)
>  ... 32 more
> Caused by: GSSException: No valid credentials provided (Mechanism level:
> Server not found in Kerberos database (7) - Server not found in Kerberos
> database)
>  at
> sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:710)
> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248)
>  at
> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
> at
> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>  ... 41 more
> Caused by: KrbException: Server not found in Kerberos database (7) -
> Server not found in Kerberos database
> at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:73)
>  at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:192)
> at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:203)
>  at
> sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:311)
> at
> sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:115)
>  at
> sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:449)
> at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:641)
>  ... 44 more
> Caused by: KrbException: Identifier doesn't match expected value (906)
> at sun.security.krb5.internal.KDCRep.init(KDCRep.java:143)
>  at sun.security.krb5.internal.TGSRep.init(TGSRep.java:66)
> at sun.security.krb5.internal.TGSRep.<init>(TGSRep.java:61)
>  at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:55)
> ... 50 more
> 2014-01-15 19:30:18 DEBUG Client:1107 - IPC Client (23781497) connection
> to example.com/127.0.0.1:8020 from hdfs/EXAMPLE.COM@EXAMPLE.COM: closed
>

Re: "Server not found in Kerberos database" for MiniKDC server

Posted by Elazar Leibovich <el...@gmail.com>.
For the sake of completeness:

The same settings worked on a Linux box.
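
Judging from the Kerberos debug trace in the quoted message, after the first TGS request fails the Mac JVM falls back to the native (LSA) ticket cache and the realm from the machine's /etc/krb5.conf. One thing that may be worth trying on the Mac, sketched below and not verified in this thread, is pinning the client JVM to the MiniKDC realm and KDC before any login happens (50064 is the KDC port shown in the trace; the class name is only for illustration):

public final class Krb5ClientPinning {
    /** cwd is the directory holding the MiniKDC-generated krb5.conf, as in the program above. */
    public static void pinToMiniKdc(String cwd) {
        // Point the JVM at the MiniKDC-generated krb5.conf ...
        System.setProperty("java.security.krb5.conf", cwd + "/krb5.conf");
        // ... and pin realm and KDC explicitly (both properties must be set together),
        // so a fallback to the machine's /etc/krb5.conf realm cannot take over.
        System.setProperty("java.security.krb5.realm", "EXAMPLE.COM");
        System.setProperty("java.security.krb5.kdc", "localhost:50064");
        // The same Kerberos tracing that produced the log above.
        System.setProperty("sun.security.krb5.debug", "true");
    }
}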


On Wed, Jan 15, 2014 at 10:57 PM, Elazar Leibovich <el...@gmail.com> wrote:

> [original message and full Kerberos debug log quoted here; trimmed, see above]

Re: "Server not found in Kerberos database" for MiniKDC server

Posted by Elazar Leibovich <el...@gmail.com>.
For the sake of completion.

The same settings worked in a Linux box.


On Wed, Jan 15, 2014 at 10:57 PM, Elazar Leibovich <el...@gmail.com>wrote:

> Hi,
>
> For educational purposes, I'm trying to set a minimal working secure
> Hadoop cluster on my machine.
>
> What I basically did is:
>
> Add example.com to /etc/hosts
>
> Set a minkdc server. It'll generate krb5.conf and keytab. Generates some
> users - {nn,dn,hdfs}@EXAMPLE.COM
>
> Refer Java to krb5.conf with HADOOP_OPTS, as well as a required workaround
> for Mac OS X java:
>
> ❯ ~/hadoopconf env HADOOP_OPTS
> hadoop-env.sh HADOOP_OPTS = -Djava.awt.headless=true
> -Djava.security.krb5.conf=/Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf
> -Djava.net.preferIPv4Stack=true
>
>
> Set the proper hadoop configuration using the keytab and the hadoop users:
>
> ❯ ~/hadoopconf get --local
> hdfs-site.xml dfs.datanode.address            = example.com:1004
> core-site.xml fs.defaultFS                    = hdfs://example.com
> hdfs-site.xml dfs.namenode.keytab.file        =
> /Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/keytab
> hdfs-site.xml dfs.datanode.hostname           = example.com
> hdfs-site.xml dfs.datanode.kerberos.principal = dn/EXAMPLE.COM@EXAMPLE.COM
> hdfs-site.xml dfs.datanode.data.dir           =
> /tmp/hadoop-eleibovi/dfs/data
> hdfs-site.xml dfs.datanode.keytab.file        =
> /Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/keytab
> hdfs-site.xml dfs.namenode.kerberos.principal = nn/EXAMPLE.COM@EXAMPLE.COM
> core-site.xml hadoop.security.authorization   = true
> core-site.xml hadoop.security.authentication  = kerberos
> hdfs-site.xml dfs.datanode.dns.interface      = lo0
> hdfs-site.xml dfs.datanode.http.address       = example.com:1006
>
> Start the namenode service.
> $ ./bin/hdfs
> ...
> 14/01/15 19:22:43 INFO ipc.Server: IPC Server listener on 8020: starting
> 14/01/15 19:22:43 INFO namenode.NameNode: NameNode RPC up at: localhost/
> 127.0.0.1:8020
> 14/01/15 19:22:43 INFO namenode.FSNamesystem: Starting services required
> for active state
>
> Finally use the following short Java program to contact the namenode:
>
> System.setProperty("java.security.krb5.conf", cwd + "/krb5.conf");
> UserGroupInformation.setConfiguration(conf);
>         UserGroupInformation ugi = UserGroupInformation.
>                 loginUserFromKeytabAndReturnUGI("hdfs/EXAMPLE.COM", cwd +
> "/keytab");
>  ugi.doAs(new PrivilegedExceptionAction<Object>() {
>             @Override
>             public Object run() throws Exception {
>                 final FileSystem fs = FileSystem.get(conf);
>                 fs.getFileStatus(new Path("/"));
>          }
> }
>
> The exception I got is:
>
> Exception in thread "main" java.io.IOException: Failed on local exception:
> java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed
> [Caused by GSSException: No valid credentials provided (Mechanism level:
> Server not found in Kerberos database (7) - Server not found in Kerberos
> database)]; Host Details : local host is: "tlv-mpbxb/127.0.0.1";
> destination host is: "example.com":8020;
>
> I'll be glad to any help with debugging the problem.
>
> Thanks,
>
> I attach a full log with Kerberos debug turned on:
>
> args: [-conf,
> /Users/eleibovi/dev/securehadoop/hadoop-2.1.0-beta/etc/hadoop/core-site.xml,
> -conf,
> /Users/eleibovi/dev/securehadoop/hadoop-2.1.0-beta/etc/hadoop/hdfs-site.xml]
> 2014-01-15 19:29:46 DEBUG MutableMetricsFactory:42 - field
> org.apache.hadoop.metrics2.lib.MutableRate
> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess
> with annotation
> @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=,
> value=[Rate of successful kerberos logins and latency (milliseconds)],
> always=false, type=DEFAULT, sampleName=Ops)
> 2014-01-15 19:29:46 DEBUG MutableMetricsFactory:42 - field
> org.apache.hadoop.metrics2.lib.MutableRate
> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure
> with annotation
> @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=,
> value=[Rate of failed kerberos logins and latency (milliseconds)],
> always=false, type=DEFAULT, sampleName=Ops)
> 2014-01-15 19:29:46 DEBUG MetricsSystemImpl:220 - UgiMetrics, User and
> group related metrics
> Java config name:
> /Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf
> Loaded from Java config
> 2014-01-15 19:29:46 DEBUG Groups:180 -  Creating new Groups object
> 2014-01-15 19:29:46 DEBUG NativeCodeLoader:46 - Trying to load the
> custom-built native-hadoop library...
> 2014-01-15 19:29:46 DEBUG NativeCodeLoader:55 - Failed to load
> native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in
> java.library.path
> 2014-01-15 19:29:46 DEBUG NativeCodeLoader:56 -
> java.library.path=/Users/eleibovi/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
> 2014-01-15 19:29:46 WARN  NativeCodeLoader:62 - Unable to load
> native-hadoop library for your platform... using builtin-java classes where
> applicable
> 2014-01-15 19:29:46 DEBUG JniBasedUnixGroupsMappingWithFallback:40 -
> Falling back to shell based
> 2014-01-15 19:29:46 DEBUG JniBasedUnixGroupsMappingWithFallback:44 - Group
> mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
> 2014-01-15 19:29:46 DEBUG Groups:66 - Group mapping
> impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback;
> cacheTimeout=300000
> Java config name:
> /Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf
> Loaded from Java config
> >>> KdcAccessibility: reset
> >>> KdcAccessibility: reset
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): nn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 53; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): nn
>  >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 69; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): nn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 61; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): nn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 61; type: 23
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): dn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 53; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): dn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 69; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): dn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 61; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): dn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 61; type: 23
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): eleibovi
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 59; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): eleibovi
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 75; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): eleibovi
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 67; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): eleibovi
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 67; type: 23
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 55; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 71; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 63; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 63; type: 23
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTab: load() entry length: 42; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTab: load() entry length: 58; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTab: load() entry length: 50; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTab: load() entry length: 50; type: 23
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): root
> >>> KeyTab: load() entry length: 42; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): root
> >>> KeyTab: load() entry length: 58; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): root
> >>> KeyTab: load() entry length: 50; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): root
> >>> KeyTab: load() entry length: 50; type: 23
> Added key: 23version: 0
> Added key: 17version: 0
> Added key: 16version: 0
> Added key: 3version: 0
> Ordering keys wrt default_tkt_enctypes list
> Using builtin default etypes for default_tkt_enctypes
> default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
> Added key: 23version: 0
> Added key: 17version: 0
> Added key: 16version: 0
> Added key: 3version: 0
> Ordering keys wrt default_tkt_enctypes list
> Using builtin default etypes for default_tkt_enctypes
> default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
> Using builtin default etypes for default_tkt_enctypes
> default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
> >>> KrbAsReq creating message
> >>> KrbKdcReq send: kdc=localhost TCP:50064, timeout=30000, number of
> retries =3, #bytes=158
> >>> KDCCommunication: kdc=localhost TCP:50064, timeout=30000,Attempt =1,
> #bytes=158
> >>>DEBUG: TCPClient reading 529 bytes
> >>> KrbKdcReq send: #bytes read=529
> >>> KdcAccessibility: remove localhost:50064
> Added key: 23version: 0
> Added key: 17version: 0
> Added key: 16version: 0
> Added key: 3version: 0
> Ordering keys wrt default_tkt_enctypes list
> Using builtin default etypes for default_tkt_enctypes
> default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
> >>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
> >>> KrbAsRep cons in KrbAsReq.getReply hdfs/EXAMPLE.COM
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:176 - hadoop login
> Added key: 23version: 0
> Added key: 17version: 0
> Added key: 16version: 0
> Added key: 3version: 0
> Ordering keys wrt default_tkt_enctypes list
> Using builtin default etypes for default_tkt_enctypes
> default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:125 - hadoop login commit
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:139 - using kerberos
> user:hdfs/EXAMPLE.COM@EXAMPLE.COM
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:1499 - PrivilegedAction
> as:hdfs/EXAMPLE.COM@EXAMPLE.COM (auth:KERBEROS)
> from:com.github.elazar.hadoop.examples.HadoopBasicUsage.run(HadoopBasicUsage.java:34)
> >>KERBEROS
> 2014-01-15 19:29:47 DEBUG BlockReaderLocal:326 -
> dfs.client.use.legacy.blockreader.local = false
> 2014-01-15 19:29:47 DEBUG BlockReaderLocal:329 -
> dfs.client.read.shortcircuit = false
> 2014-01-15 19:29:47 DEBUG BlockReaderLocal:332 -
> dfs.client.domain.socket.data.traffic = false
> 2014-01-15 19:29:47 DEBUG BlockReaderLocal:335 - dfs.domain.socket.path =
> 2014-01-15 19:29:47 DEBUG MetricsSystemImpl:220 - StartupProgress,
> NameNode startup progress
> 2014-01-15 19:29:47 DEBUG RetryUtils:74 - multipleLinearRandomRetry = null
> 2014-01-15 19:29:47 DEBUG Server:220 - rpcKind=RPC_PROTOCOL_BUFFER,
> rpcRequestWrapperClass=class
> org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper,
> rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@379b0d9c
> 2014-01-15 19:29:47 DEBUG BlockReaderLocal:63 - Both short-circuit local
> reads and UNIX domain socket are disabled.
> 2014-01-15 19:29:47 DEBUG Shell:237 - Failed to detect a valid hadoop home
> directory
> java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
> at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:219)
> at org.apache.hadoop.util.Shell.<clinit>(Shell.java:244)
>  at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
> at
> org.apache.hadoop.conf.Configuration.getTrimmedStrings(Configuration.java:1539)
>  at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:492)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:445)
>  at
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:136)
> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2429)
>  at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
> at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2463)
>  at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2445)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:363)
>  at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:165)
> at
> com.github.elazar.hadoop.examples.HadoopBasicUsage$1.run(HadoopBasicUsage.java:38)
>  at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
>  at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
> at
> com.github.elazar.hadoop.examples.HadoopBasicUsage.run(HadoopBasicUsage.java:34)
>  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>  at
> com.github.elazar.hadoop.examples.HadoopBasicUsage.main(HadoopBasicUsage.java:18)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:606)
> at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
> 2014-01-15 19:29:47 DEBUG Shell:316 - setsid is not available on this
> machine. So not using it.
> 2014-01-15 19:29:47 DEBUG Shell:320 - setsid exited with exit code 0
> 2014-01-15 19:29:47 DEBUG Client:371 - The ping interval is 60000 ms.
> 2014-01-15 19:29:47 DEBUG Client:636 - Connecting to
> example.com/127.0.0.1:8020
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:1499 - PrivilegedAction
> as:hdfs/EXAMPLE.COM@EXAMPLE.COM (auth:KERBEROS)
> from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:654)
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:438 - Sending sasl message state:
> NEGOTIATE
>
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:370 - Received SASL message state:
> NEGOTIATE
> auths {
>   method: "TOKEN"
>   mechanism: "DIGEST-MD5"
>   protocol: ""
>   serverId: "default"
>   challenge:
> "realm=\"default\",nonce=\"Evyt9cWyZFDUbCtQYNcgF5FY7rsBqxVCgtggY48n\",qop=\"auth\",charset=utf-8,algorithm=md5-sess"
> }
> auths {
>   method: "KERBEROS"
>   mechanism: "GSSAPI"
>   protocol: "nn"
>   serverId: "EXAMPLE.COM"
> }
>
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:259 - Get token info
> proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB
> info:@org.apache.hadoop.security.token.TokenInfo(value=class
> org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector)
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:287 - Get kerberos info
> proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB
> info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=,
> serverPrincipal=dfs.namenode.kerberos.principal)
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:231 - RPC Server's Kerberos
> principal name for
> protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is nn/
> EXAMPLE.COM@EXAMPLE.COM
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:242 - Creating SASL
> GSSAPI(KERBEROS)  client to authenticate to service at EXAMPLE.COM
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:172 - Use KERBEROS authentication
> for protocol ClientNamenodeProtocolPB
> Found ticket for hdfs/EXAMPLE.COM@EXAMPLE.COM to go to krbtgt/
> EXAMPLE.COM@EXAMPLE.COM expiring on Thu Jan 16 19:29:46 IST 2014
> Entered Krb5Context.initSecContext with state=STATE_NEW
> Found ticket for hdfs/EXAMPLE.COM@EXAMPLE.COM to go to krbtgt/
> EXAMPLE.COM@EXAMPLE.COM expiring on Thu Jan 16 19:29:46 IST 2014
> Service ticket not found in the subject
> >>> Credentials acquireServiceCreds: same realm
> Using builtin default etypes for default_tgs_enctypes
> default etypes for default_tgs_enctypes: 18 17 16 23 1 3.
> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
> >>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
> >>> KrbKdcReq send: kdc=localhost TCP:50064, timeout=30000, number of
> retries =3, #bytes=583
> >>> KDCCommunication: kdc=localhost TCP:50064, timeout=30000,Attempt =1,
> #bytes=583
> >>>DEBUG: TCPClient reading 135 bytes
> >>> KrbKdcReq send: #bytes read=135
> >>> KdcAccessibility: remove localhost:50064
> >>> KDCRep: init() encoding tag is 126 req type is 13
> >>>KRBError:
>  sTime is Wed Jan 15 19:29:47 IST 2014 1389806987000
>  suSec is 0
>  error code is 7
>  error Message is Server not found in Kerberos database
>  realm is EXAMPLE.COM
>  sname is krbtgt/EXAMPLE.COM
>  msgType is 30
> 2014-01-15 19:29:47 ERROR UserGroupInformation:1480 -
> PriviledgedActionException as:hdfs/EXAMPLE.COM@EXAMPLE.COM(auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate
> failed [Caused by GSSException: No valid credentials provided (Mechanism
> level: Server not found in Kerberos database (7) - Server not found in
> Kerberos database)]
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:1499 - PrivilegedAction
> as:hdfs/EXAMPLE.COM@EXAMPLE.COM (auth:KERBEROS)
> from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:583)
> >>>KinitOptions cache name is /Users/eleibovi/krb5cc_eleibovi
> >> Acquire default native Credentials
> >>> Obtained TGT from LSA: Credentials:
> *...Here it looks like it tries to get the machine's Kerberos tickets...*
> client=eleibovi@[snipped computer Kerberos server from /etc/krb5.conf]
> server=krbtgt/[snipped computer Kerberos server from /etc/krb5.conf]
> authTime=20140109035156Z
> startTime=20140109115308Z
> endTime=20140109215308Z
> renewTill=20140116035156Z
> flags: FORWARDABLE;RENEWABLE;INITIAL;PRE-AUTHENT
> EType (int): 18
> Using builtin default etypes for default_tgs_enctypes
> default etypes for default_tgs_enctypes: 18 17 16 23 1 3.
> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
> >>> EType: sun.security.krb5.internal.crypto.Aes256CtsHmacSha1EType
> getKDCFromDNS using UDP
> getKDCFromDNS using TCP
> 2014-01-15 19:30:18 DEBUG UserGroupInformation:176 - hadoop login
> 2014-01-15 19:30:18 DEBUG UserGroupInformation:125 - hadoop login commit
> 2014-01-15 19:30:18 DEBUG UserGroupInformation:139 - using kerberos
> user:null
> 2014-01-15 19:30:18 DEBUG UserGroupInformation:155 - using local
> user:UnixPrincipal: eleibovi
> 2014-01-15 19:30:18 DEBUG UserGroupInformation:696 - UGI
> loginUser:eleibovi (auth:KERBEROS)
> 2014-01-15 19:30:18 WARN  Client:615 - Exception encountered while
> connecting to the server : javax.security.sasl.SaslException: GSS initiate
> failed [Caused by GSSException: No valid credentials provided (Mechanism
> level: Server not found in Kerberos database (7) - Server not found in
> Kerberos database)]
> 2014-01-15 19:30:18 ERROR UserGroupInformation:1480 -
> PriviledgedActionException as:hdfs/EXAMPLE.COM@EXAMPLE.COM(auth:KERBEROS) cause:java.io.IOException:
> javax.security.sasl.SaslException: GSS initiate failed [Caused by
> GSSException: No valid credentials provided (Mechanism level: Server not
> found in Kerberos database (7) - Server not found in Kerberos database)]
> 2014-01-15 19:30:18 DEBUG Client:1099 - closing ipc connection to
> example.com/127.0.0.1:8020: javax.security.sasl.SaslException: GSS
> initiate failed [Caused by GSSException: No valid credentials provided
> (Mechanism level: Server not found in Kerberos database (7) - Server not
> found in Kerberos database)]
> java.io.IOException: javax.security.sasl.SaslException: GSS initiate
> failed [Caused by GSSException: No valid credentials provided (Mechanism
> level: Server not found in Kerberos database (7) - Server not found in
> Kerberos database)]
>  at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:620)
> at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:415)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
>  at
> org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:583)
> at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:667)
>  at org.apache.hadoop.ipc.Client$Connection.access$2600(Client.java:314)
> at org.apache.hadoop.ipc.Client.getConnection(Client.java:1399)
>  at org.apache.hadoop.ipc.Client.call(Client.java:1318)
> at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>  at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
> at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>  at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
>  at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:188)
> at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>  at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
> at
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:651)
>  at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1636)
> at
> org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1117)
>  at
> org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1113)
> at
> org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:78)
>  at
> org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1113)
> at
> com.github.elazar.hadoop.examples.HadoopBasicUsage$1.run(HadoopBasicUsage.java:40)
>  at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
>  at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
> at
> com.github.elazar.hadoop.examples.HadoopBasicUsage.run(HadoopBasicUsage.java:34)
>  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>  at
> com.github.elazar.hadoop.examples.HadoopBasicUsage.main(HadoopBasicUsage.java:18)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:606)
> at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
> Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused
> by GSSException: No valid credentials provided (Mechanism level: Server not
> found in Kerberos database (7) - Server not found in Kerberos database)]
>  at
> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
> at
> org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:394)
>  at
> org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:494)
> at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:314)
>  at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:659)
> at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:655)
>  at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
>  at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
> at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:654)
>  ... 32 more
> Caused by: GSSException: No valid credentials provided (Mechanism level:
> Server not found in Kerberos database (7) - Server not found in Kerberos
> database)
>  at
> sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:710)
> at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248)
>  at
> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
> at
> com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>  ... 41 more
> Caused by: KrbException: Server not found in Kerberos database (7) -
> Server not found in Kerberos database
> at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:73)
>  at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:192)
> at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:203)
>  at
> sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:311)
> at
> sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:115)
>  at
> sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:449)
> at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:641)
>  ... 44 more
> Caused by: KrbException: Identifier doesn't match expected value (906)
> at sun.security.krb5.internal.KDCRep.init(KDCRep.java:143)
>  at sun.security.krb5.internal.TGSRep.init(TGSRep.java:66)
> at sun.security.krb5.internal.TGSRep.<init>(TGSRep.java:61)
>  at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:55)
> ... 50 more
> 2014-01-15 19:30:18 DEBUG Client:1107 - IPC Client (23781497) connection
> to example.com/127.0.0.1:8020 from hdfs/EXAMPLE.COM@EXAMPLE.COM: closed
>
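For what it's worth, the TGS-REQ in the log above asks the MiniKDC for the
server principal nn/EXAMPLE.COM@EXAMPLE.COM, and that is the lookup that comes
back with "Server not found in Kerberos database". A small standalone program
can exercise the same keytab login and service-ticket request without going
through Hadoop RPC at all. The sketch below is illustrative and untested here:
the krb5.conf/keytab paths and the hdfs/EXAMPLE.COM and
nn/EXAMPLE.COM@EXAMPLE.COM principal names are copied from the log, while the
class name, the inline JAAS configuration and the GSS-API calls are assumptions
about how to reproduce that single step.

import java.security.PrivilegedExceptionAction;
import java.util.HashMap;
import java.util.Map;

import javax.security.auth.Subject;
import javax.security.auth.login.AppConfigurationEntry;
import javax.security.auth.login.Configuration;
import javax.security.auth.login.LoginContext;

import org.ietf.jgss.GSSContext;
import org.ietf.jgss.GSSManager;
import org.ietf.jgss.GSSName;
import org.ietf.jgss.Oid;

// Standalone check: log in from the keytab and ask the KDC for a service
// ticket for the namenode principal, without involving Hadoop RPC.
public class ServiceTicketCheck {
    public static void main(String[] args) throws Exception {
        // Paths and principals below are taken from the log in this thread.
        System.setProperty("java.security.krb5.conf",
                "/Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf");
        System.setProperty("sun.security.krb5.debug", "true");

        // Inline JAAS configuration equivalent to a keytab login
        // (illustrative; normally this would live in a jaas.conf file).
        Configuration jaas = new Configuration() {
            @Override
            public AppConfigurationEntry[] getAppConfigurationEntry(String name) {
                Map<String, String> opts = new HashMap<String, String>();
                opts.put("useKeyTab", "true");
                opts.put("keyTab",
                        "/Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/keytab");
                opts.put("principal", "hdfs/EXAMPLE.COM");
                opts.put("storeKey", "true");
                opts.put("doNotPrompt", "true");
                return new AppConfigurationEntry[] { new AppConfigurationEntry(
                        "com.sun.security.auth.module.Krb5LoginModule",
                        AppConfigurationEntry.LoginModuleControlFlag.REQUIRED, opts) };
            }
        };

        // AS exchange: obtain a TGT from the keytab.
        LoginContext lc = new LoginContext("check", null, null, jaas);
        lc.login();

        // TGS exchange: request a ticket for the namenode's server principal.
        Subject.doAs(lc.getSubject(), new PrivilegedExceptionAction<Void>() {
            @Override
            public Void run() throws Exception {
                GSSManager manager = GSSManager.getInstance();
                Oid krb5Mech = new Oid("1.2.840.113554.1.2.2");
                GSSName server = manager.createName(
                        "nn/EXAMPLE.COM@EXAMPLE.COM", GSSName.NT_USER_NAME, krb5Mech);
                GSSContext ctx = manager.createContext(
                        server, krb5Mech, null, GSSContext.DEFAULT_LIFETIME);
                ctx.requestMutualAuth(true);
                ctx.initSecContext(new byte[0], 0, 0); // this triggers the TGS-REQ
                ctx.dispose();
                return null;
            }
        });
        System.out.println("TGS request for nn/EXAMPLE.COM@EXAMPLE.COM succeeded");
    }
}

If that program fails with the same KRB-ERROR 7, the problem is between the
credentials/principals and the MiniKDC rather than anything in the Hadoop
client itself.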