Posted to user@hbase.apache.org by Aneela Saleem <an...@platalytics.com> on 2016/07/28 13:05:58 UTC

issue starting regionserver with SASL authentication failed

Hi,

I have successfully configured Zookeeper with Kerberos authentication. Now
I'm facing an issue while configuring HBase with Kerberos authentication. I
have followed this link
<http://www.cloudera.com/documentation/archive/cdh/4-x/4-2-0/CDH4-Security-Guide/cdh4sg_topic_8_2.html>.
Attached are the configuration files, i.e., hbase-site.xml and
zk-jaas.conf.
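
For reference, the CDH guide linked above wires these two files up roughly
as in the sketch below; the keytab path and principal are illustrative, not
the actual attachment contents. A minimal zk-jaas.conf, letting the HBase
daemons authenticate to ZooKeeper with a keytab:

    Client {
      com.sun.security.auth.module.Krb5LoginModule required
      useKeyTab=true
      useTicketCache=false
      storeKey=true
      keyTab="/etc/hbase/conf/hbase.keytab"
      principal="hbase/hadoop-master@platalyticsrealm";
    };

and the core security properties in hbase-site.xml:

    <!-- turn on Kerberos authentication for HBase RPC -->
    <property>
      <name>hbase.security.authentication</name>
      <value>kerberos</value>
    </property>
    <!-- principal and keytab the regionserver logs in with -->
    <property>
      <name>hbase.regionserver.kerberos.principal</name>
      <value>hbase/_HOST@platalyticsrealm</value>
    </property>
    <property>
      <name>hbase.regionserver.keytab.file</name>
      <value>/etc/hbase/conf/hbase.keytab</value>
    </property>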

Following are the logs from the regionserver:

2016-07-28 17:44:56,881 WARN  [regionserver/hadoop-master/192.168.23.206:16020] regionserver.HRegionServer: error telling master we are up
com.google.protobuf.ServiceException: java.io.IOException: Could not set up IO Streams to hadoop-master/192.168.23.206:16000
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:240)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
    at org.apache.hadoop.hbase.protobuf.generated.RegionServerStatusProtos$RegionServerStatusService$BlockingStub.regionServerStartup(RegionServerStatusProtos.java:8982)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.reportForDuty(HRegionServer.java:2284)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.run(HRegionServer.java:906)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Could not set up IO Streams to hadoop-master/192.168.23.206:16000
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:785)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1241)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227)
    ... 5 more
Caused by: java.lang.RuntimeException: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:685)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:643)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:751)
    ... 9 more
Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
    at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:617)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$700(RpcClientImpl.java:162)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:743)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:740)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:740)
    ... 9 more
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
    at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
    at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
    at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
    at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
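
The root cause above is the "Consider 'kinit'" hint: the regionserver's
login subject held no usable Kerberos TGT when the SASL/GSSAPI handshake
ran. A quick sanity check on the regionserver node looks like the sketch
below (the keytab path and principal are illustrative):

    # list the principals stored in the service keytab
    klist -kt /etc/hbase/conf/hbase.keytab
    # try to obtain a ticket with that keytab, as the daemon would
    kinit -kt /etc/hbase/conf/hbase.keytab hbase/hadoop-master@platalyticsrealm
    # confirm the ticket cache now holds a TGT
    klist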


Please have a look; what's going wrong here?

Thanks

Re: issue starting regionserver with SASL authentication failed

Posted by Rakesh Radhakrishnan <ra...@apache.org>.
Hey Aneela,

I've filtered the output below from your log messages. It looks like you
have a "/ranger" directory under the root directory, and directory listing
is working fine.

*Found 1 items*
*drwxr-xr-x   - hdfs supergroup          0 2016-08-02 14:44 /ranger*

I think it's putting all the log messages on the console because the log
configuration may be missing; you may need to check the logging configuration
of both the Kerberos and Hadoop console clients. Perhaps you can refer to the
HADOOP_LOG_DIR section in https://wiki.apache.org/hadoop/HowToConfigure and
https://hadoop.apache.org/docs/r2.7.2/hadoop-project-dist/hadoop-common/ClusterSetup.html#Logging.
Also, for Kerberos, you can try passing "-Dsun.security.krb5.debug=false"
when starting the JVM.
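
For example, one common place to pass that flag is HADOOP_OPTS in
hadoop-env.sh (a sketch; adjust to wherever you set client JVM options):

    # hadoop-env.sh: silence the JDK Kerberos debug output on the console
    export HADOOP_OPTS="$HADOOP_OPTS -Dsun.security.krb5.debug=false"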

Thanks,
Rakesh
Intel

On Tue, Aug 2, 2016 at 10:35 PM, Aneela Saleem <an...@platalytics.com>
wrote:

> Hi all,
>
> I'm facing an issue starting the region server in HBase. I have enabled
> Kerberos debugging on the Hadoop command line, so when I run the "hadoop
> fs -ls /" command, I get the following output, which I can't interpret.
> Can anyone please tell me whether something is wrong with the Kerberos
> configuration, or whether everything is fine?
>
>
> 16/08/02 18:34:10 DEBUG util.Shell: setsid exited with exit code 0
> 16/08/02 18:34:10 DEBUG conf.Configuration: parsing URL
> jar:file:/usr/local/hadoop/share/hadoop/common/hadoop-
> common-2.7.2.jar!/core-default.xml
> 16/08/02 18:34:10 DEBUG conf.Configuration: parsing input stream
> sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@4fbc7b65
> 16/08/02 18:34:10 DEBUG conf.Configuration: parsing URL
> file:/usr/local/hadoop/etc/hadoop/core-site.xml
> 16/08/02 18:34:10 DEBUG conf.Configuration: parsing input stream
> java.io.BufferedInputStream@69c1adfa
> 16/08/02 18:34:11 DEBUG lib.MutableMetricsFactory: field
> org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.
> UserGroupInformation$UgiMetrics.loginSuccess with annotation
> @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate
> of successful kerberos logins and latency (milliseconds)], about=,
> always=false, type=DEFAULT, sampleName=Ops)
> 16/08/02 18:34:11 DEBUG lib.MutableMetricsFactory: field
> org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.
> UserGroupInformation$UgiMetrics.loginFailure with annotation
> @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate
> of failed kerberos logins and latency (milliseconds)], about=,
> always=false, type=DEFAULT, sampleName=Ops)
> 16/08/02 18:34:11 DEBUG lib.MutableMetricsFactory: field
> org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.
> UserGroupInformation$UgiMetrics.getGroups with annotation
> @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time,
> value=[GetGroups], about=, always=false, type=DEFAULT, sampleName=Ops)
> 16/08/02 18:34:11 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group
> related metrics
> Java config name: null
> Native config name: /etc/krb5.conf
> Loaded from native config
> 16/08/02 18:34:11 DEBUG security.Groups:  Creating new Groups object
> 16/08/02 18:34:11 DEBUG security.Groups: Group mapping
> impl=org.apache.hadoop.security.LdapGroupsMapping; cacheTimeout=300000;
> warningDeltaMs=5000
> >>>KinitOptions cache name is /tmp/krb5cc_0
> >>>DEBUG <CCacheInputStream>  client principal is nn/hadoop-master@
> platalyticsrealm
> >>>DEBUG <CCacheInputStream> server principal is krbtgt/platalyticsrealm@
> platalyticsrealm
> >>>DEBUG <CCacheInputStream> key type: 16
> >>>DEBUG <CCacheInputStream> auth time: Tue Aug 02 18:23:59 PKT 2016
> >>>DEBUG <CCacheInputStream> start time: Tue Aug 02 18:23:59 PKT 2016
> >>>DEBUG <CCacheInputStream> end time: Wed Aug 03 06:23:59 PKT 2016
> >>>DEBUG <CCacheInputStream> renew_till time: Tue Aug 09 18:23:59 PKT 2016
> >>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE; INITIAL;
> >>>DEBUG <CCacheInputStream>  client principal is nn/hadoop-master@
> platalyticsrealm
> >>>DEBUG <CCacheInputStream> server principal is
> X-CACHECONF:/krb5_ccache_conf_data/fast_avail/krbtgt/platalyticsrealm@
> platalyticsrealm
> >>>DEBUG <CCacheInputStream> key type: 0
> >>>DEBUG <CCacheInputStream> auth time: Thu Jan 01 05:00:00 PKT 1970
> >>>DEBUG <CCacheInputStream> start time: null
> >>>DEBUG <CCacheInputStream> end time: Thu Jan 01 05:00:00 PKT 1970
> >>>DEBUG <CCacheInputStream> renew_till time: null
> >>> CCacheInputStream: readFlags()
> 16/08/02 18:34:11 DEBUG security.UserGroupInformation: hadoop login
> 16/08/02 18:34:11 DEBUG security.UserGroupInformation: hadoop login commit
> 16/08/02 18:34:11 DEBUG security.UserGroupInformation: using kerberos
> user:nn/hadoop-master@platalyticsrealm
> 16/08/02 18:34:11 DEBUG security.UserGroupInformation: Using user:
> "nn/hadoop-master@platalyticsrealm" with name nn/hadoop-master@
> platalyticsrealm
> 16/08/02 18:34:11 DEBUG security.UserGroupInformation: User entry:
> "nn/hadoop-master@platalyticsrealm"
> 16/08/02 18:34:11 DEBUG security.UserGroupInformation: UGI
> loginUser:nn/hadoop-master@platalyticsrealm (auth:KERBEROS)
> 16/08/02 18:34:12 DEBUG security.UserGroupInformation: Found tgt Ticket
> (hex) =
> 0000: 61 82 01 72 30 82 01 6E   A0 03 02 01 05 A1 12 1B  a..r0..n........
> 0010: 10 70 6C 61 74 61 6C 79   74 69 63 73 72 65 61 6C  .platalyticsreal
> 0020: 6D A2 25 30 23 A0 03 02   01 02 A1 1C 30 1A 1B 06  m.%0#.......0...
> 0030: 6B 72 62 74 67 74 1B 10   70 6C 61 74 61 6C 79 74  krbtgt..platalyt
> 0040: 69 63 73 72 65 61 6C 6D   A3 82 01 2A 30 82 01 26  icsrealm...*0..&
> 0050: A0 03 02 01 10 A1 03 02   01 01 A2 82 01 18 04 82  ................
> 0060: 01 14 A5 A9 41 A6 B7 0E   8F 70 F4 03 41 64 8D DC  ....A....p..Ad..
> 0070: 78 2F FB 08 58 C9 39 44   CF D0 8D B0 85 09 62 8C  x/..X.9D......b.
> 0080: 40 CF 45 13 D3 B9 CD 38   84 92 33 24 B2 0D C1 65  @.E....8..3$...e
> 0090: C7 1B 0D 3E F2 92 A2 8B   58 34 77 5F F6 E3 AA B6  ...>....X4w_....
> 00A0: EB 8E 58 46 AC 54 DB 9B   79 3E ED A1 83 0C D3 D3  ..XF.T..y>......
> 00B0: 02 8B 42 52 6D 92 F1 39   BA E7 56 D4 BA A6 03 B6  ..BRm..9..V.....
> 00C0: 16 5A DC 1A 69 F4 DF A5   CD F6 48 AC 08 32 D3 AD  .Z..i.....H..2..
> 00D0: 22 8E E9 52 00 93 78 41   1C 26 4F 0B 42 2C EF E9  "..R..xA.&O.B,..
> 00E0: B8 0E 84 39 E4 AF 3A 60   7D 04 EE 70 18 C0 E7 21  ...9..:`...p...!
> 00F0: 0B 70 18 42 33 5E D9 CA   94 C0 6F 6A C0 39 72 7B  .p.B3^....oj.9r.
> 0100: FD 6E F1 09 CE 2D 02 EA   DA 52 5C 1B B2 18 36 0E  .n...-...R\...6.
> 0110: 54 94 DD 7A 47 A8 F2 36   53 18 3D D7 5C 68 58 71  T..zG..6S.=.\hXq
> 0120: 63 DB 36 88 B9 87 62 DC   BA 86 C3 F0 55 05 D8 15  c.6...b.....U...
> 0130: 6E 70 FD 8E 64 63 3D 51   36 EC 9E 63 30 77 BE 98  np..dc=Q6..c0w..
> 0140: 1D A0 DC 97 04 6F 03 AB   12 52 F8 68 7C 6C D0 88  .....o...R.h.l..
> 0150: 16 FC 17 69 3E 02 4B 59   E8 22 B3 1B 13 70 B2 6A  ...i>.KY."...p.j
> 0160: 3F 05 3B 1C 91 3D 03 A8   30 64 1C B1 59 42 17 FB  ?.;..=..0d..YB..
> 0170: 1B B2 76 E0 BC 49                                  ..v..I
>
> Client Principal = nn/hadoop-master@platalyticsrealm
> Server Principal = krbtgt/platalyticsrealm@platalyticsrealm
> Session Key = EncryptionKey: keyType=16 keyBytes (hex dump)=
> 0000: B5 4A 9B 0E 1C 6D 1C 34   D5 DF DA F2 9D 4C C2 FE  .J...m.4.....L..
> 0010: D9 0D 67 A2 79 6D 8C 0D                            ..g.ym..
>
>
> Forwardable Ticket true
> Forwarded Ticket false
> Proxiable Ticket false
> Proxy Ticket false
> Postdated Ticket false
> Renewable Ticket true
> Initial Ticket true
> Auth Time = Tue Aug 02 18:23:59 PKT 2016
> Start Time = Tue Aug 02 18:23:59 PKT 2016
> End Time = Wed Aug 03 06:23:59 PKT 2016
> Renew Till = Tue Aug 09 18:23:59 PKT 2016
> Client Addresses  Null
> 16/08/02 18:34:12 DEBUG security.UserGroupInformation: Current time is
> 1470144852023
> 16/08/02 18:34:12 DEBUG security.UserGroupInformation: Next refresh is
> 1470178799000
> 16/08/02 18:34:12 TRACE tracing.SpanReceiverHost: No span receiver names
> found in dfs.client.htrace.spanreceiver.classes.
> 16/08/02 18:34:12 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local
> = false
> 16/08/02 18:34:12 DEBUG hdfs.BlockReaderLocal:
> dfs.client.read.shortcircuit = false
> 16/08/02 18:34:12 DEBUG hdfs.BlockReaderLocal:
> dfs.client.domain.socket.data.traffic = false
> 16/08/02 18:34:12 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
> 16/08/02 18:34:12 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
> 16/08/02 18:34:12 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER,
> rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper,
> rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$
> ProtoBufRpcInvoker@4219a40f
> 16/08/02 18:34:12 DEBUG ipc.Client: getting client out of cache:
> org.apache.hadoop.ipc.Client@5e0df7af
> 16/08/02 18:34:13 DEBUG util.NativeCodeLoader: Trying to load the
> custom-built native-hadoop library...
> 16/08/02 18:34:13 DEBUG util.NativeCodeLoader: Loaded the native-hadoop
> library
> 16/08/02 18:34:13 DEBUG unix.DomainSocketWatcher:
> org.apache.hadoop.net.unix.DomainSocketWatcher$2@1a1ff7d1: starting with
> interruptCheckPeriodMs = 60000
> 16/08/02 18:34:13 TRACE unix.DomainSocketWatcher: DomainSocketWatcher(1934811148):
> adding notificationSocket 191, connected to 190
> 16/08/02 18:34:13 DEBUG util.PerformanceAdvisory: Both short-circuit local
> reads and UNIX domain socket are disabled.
> 16/08/02 18:34:13 DEBUG sasl.DataTransferSaslUtil: DataTransferProtocol
> not using SaslPropertiesResolver, no QOP found in configuration for
> dfs.data.transfer.protection
> 16/08/02 18:34:13 TRACE ipc.ProtobufRpcEngine: 1: Call -> /
> 192.168.23.206:8020: getFileInfo {src: "/"}
> 16/08/02 18:34:13 DEBUG ipc.Client: The ping interval is 60000 ms.
> 16/08/02 18:34:13 DEBUG ipc.Client: Connecting to /192.168.23.206:8020
> 16/08/02 18:34:13 DEBUG security.UserGroupInformation: PrivilegedAction
> as:nn/hadoop-master@platalyticsrealm (auth:KERBEROS)
> from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:
> 724)
> 16/08/02 18:34:13 DEBUG security.SaslRpcClient: Sending sasl message
> state: NEGOTIATE
>
> 16/08/02 18:34:13 DEBUG security.SaslRpcClient: Received SASL message
> state: NEGOTIATE
> auths {
>   method: "TOKEN"
>   mechanism: "DIGEST-MD5"
>   protocol: ""
>   serverId: "default"
>   challenge: "realm=\"default\",nonce=\"xHi0jI3ZHzKXd2aQ0Gqx4N1qcgbdJA
> WBCa36ZeSO\",qop=\"auth\",charset=utf-8,algorithm=md5-sess"
> }
> auths {
>   method: "KERBEROS"
>   mechanism: "GSSAPI"
>   protocol: "nn"
>   serverId: "hadoop-master"
> }
>
> 16/08/02 18:34:13 DEBUG security.SaslRpcClient: Get token info
> proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB
> info:@org.apache.hadoop.security.token.TokenInfo(value=class
> org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector)
> 16/08/02 18:34:13 DEBUG security.SaslRpcClient: Get kerberos info
> proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB
> info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=,
> serverPrincipal=dfs.namenode.kerberos.principal)
> 16/08/02 18:34:13 DEBUG security.SaslRpcClient: RPC Server's Kerberos
> principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB
> is nn/hadoop-master@platalyticsrealm
> 16/08/02 18:34:13 DEBUG security.SaslRpcClient: Creating SASL
> GSSAPI(KERBEROS)  client to authenticate to service at hadoop-master
> 16/08/02 18:34:13 DEBUG security.SaslRpcClient: Use KERBEROS
> authentication for protocol ClientNamenodeProtocolPB
> Found ticket for nn/hadoop-master@platalyticsrealm to go to
> krbtgt/platalyticsrealm@platalyticsrealm expiring on Wed Aug 03 06:23:59
> PKT 2016
> Entered Krb5Context.initSecContext with state=STATE_NEW
> Found ticket for nn/hadoop-master@platalyticsrealm to go to
> krbtgt/platalyticsrealm@platalyticsrealm expiring on Wed Aug 03 06:23:59
> PKT 2016
> Service ticket not found in the subject
> >>> Credentials acquireServiceCreds: same realm
> Using builtin default etypes for default_tgs_enctypes
> default etypes for default_tgs_enctypes: 18 17 16 23 1 3.
> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
> >>> KdcAccessibility: reset
> >>> KrbKdcReq send: kdc=platalytics.com UDP:88, timeout=30000, number of
> retries =3, #bytes=727
> >>> KDCCommunication: kdc=platalytics.com UDP:88, timeout=30000,Attempt
> =1, #bytes=727
> >>> KrbKdcReq send: #bytes read=686
> >>> KdcAccessibility: remove platalytics.com
> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
> >>> KrbApReq: APOptions are 00100000 00000000 00000000 00000000
> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
> Krb5Context setting mySeqNumber to: 822249937
> Created InitSecContextToken:
> 0000: 01 00 6E 82 02 67 30 82   02 63 A0 03 02 01 05 A1  ..n..g0..c......
> 0010: 03 02 01 0E A2 07 03 05   00 20 00 00 00 A3 82 01  ......... ......
> 0020: 6F 61 82 01 6B 30 82 01   67 A0 03 02 01 05 A1 12  oa..k0..g.......
> 0030: 1B 10 70 6C 61 74 61 6C   79 74 69 63 73 72 65 61  ..platalyticsrea
> 0040: 6C 6D A2 1E 30 1C A0 03   02 01 00 A1 15 30 13 1B  lm..0........0..
> 0050: 02 6E 6E 1B 0D 68 61 64   6F 6F 70 2D 6D 61 73 74  .nn..hadoop-mast
> 0060: 65 72 A3 82 01 2A 30 82   01 26 A0 03 02 01 10 A1  er...*0..&......
> 0070: 03 02 01 04 A2 82 01 18   04 82 01 14 25 56 29 BE  ............%V).
> 0080: 2E AA 50 55 7B 2C 5C AC   BA 64 2D 4D 8D 9C 71 B1  ..PU.,\..d-M..q.
> 0090: 1A 99 14 81 4C 98 80 B2   65 86 6C 37 61 67 31 D1  ....L...e.l7ag1.
> 00A0: 6F F6 E7 7A F3 92 A5 9A   F0 BA A5 BE 1C 15 7F 14  o..z............
> 00B0: 85 7E B0 7A 81 3D 9C B6   00 80 43 00 2A 0C 89 6A  ...z.=....C.*..j
> 00C0: B1 49 EF 27 F9 97 A1 3E   5C 80 B7 0D 49 6C E0 A3  .I.'...>\...Il..
> 00D0: 73 BC C2 69 AE 92 88 26   C5 DA FD 6E AB 55 F7 60  s..i...&...n.U.`
> 00E0: D0 7E 3A A5 5D 78 4E 3F   3D 96 44 6B B9 8F EA D8  ..:.]xN?=.Dk....
> 00F0: 4E BA 70 F3 5C 25 4E ED   AD E2 76 09 FF 36 D8 6D  N.p.\%N...v..6.m
> 0100: A4 22 C3 93 10 04 04 F2   6C D4 04 C9 A9 14 95 47  ."......l......G
> 0110: 16 BA 62 6F 58 5F 4F 8E   38 23 A5 5C 1D 58 F8 D5  ..boX_O.8#.\.X..
> 0120: 87 23 3D 7F 0B A7 BE 18   25 1F F1 7B 4C 54 EC BD  .#=.....%...LT..
> 0130: A6 D4 05 4C 82 03 64 FD   5A 4E 24 D8 71 D5 5A 15  ...L..d.ZN$.q.Z.
> 0140: 4C 2E E3 12 88 19 19 09   C1 F9 31 9D 6E CE D4 6F  L.........1.n..o
> 0150: 7A 20 F6 82 BB F6 28 D1   ED A3 54 69 01 9E A4 4C  z ....(...Ti...L
> 0160: 40 E2 E0 FC F5 35 44 C1   25 8C 50 1F C0 01 1D C0  @....5D.%.P.....
> 0170: 63 A5 45 B8 56 DF F7 F8   CA 86 8B 96 0C 5C 49 EA  c.E.V........\I.
> 0180: F0 A9 70 9C 2E 0E 36 57   65 47 97 09 8C 24 F1 00  ..p...6WeG...$..
> 0190: A4 81 DA 30 81 D7 A0 03   02 01 10 A2 81 CF 04 81  ...0............
> 01A0: CC F1 F6 BE 3A A7 C0 1A   04 D0 72 DE 57 94 D1 FE  ....:.....r.W...
> 01B0: 16 7E E8 09 72 D7 83 54   B3 1C 98 59 36 86 78 12  ....r..T...Y6.x.
> 01C0: A5 02 E3 B6 8C C6 83 B5   C9 7C 53 A3 C9 79 AF C8  ..........S..y..
> 01D0: B8 1A B3 B2 A6 7E 02 1A   A5 9C 41 EA 08 87 A8 E5  ..........A.....
> 01E0: D1 0E ED 69 5C CA 33 63   24 C8 4B E1 57 D5 C3 AF  ...i\.3c$.K.W...
> 01F0: 39 0A DE F6 9F 63 3B 44   79 5B 29 F7 9A B0 2E 8B  9....c;Dy[).....
> 0200: 1C EF 4A 0B D9 3A 55 75   C5 38 B7 5C 50 11 0E 74  ..J..:Uu.8.\P..t
> 0210: BE 57 DC 70 30 DD AF 14   35 97 1C 14 11 70 46 FD  .W.p0...5....pF.
> 0220: F9 8C 14 60 DE 35 D8 DC   81 86 C7 31 1F F8 6A 65  ...`.5.....1..je
> 0230: 2D B7 8A EF F2 61 21 00   2C 8D 4F 3A 49 1E 24 80  -....a!.,.O:I.$.
> 0240: FA 56 D0 2D 0E 52 AE 29   2B 6A 4A C7 16 8F B5 D8  .V.-.R.)+jJ.....
> 0250: EC 41 18 03 34 F2 D8 94   79 82 C8 0D E2 10 72 39  .A..4...y.....r9
> 0260: 85 B9 F7 BB 54 5C 71 21   49 23 A5 4A D0           ....T\q!I#.J.
>
> 16/08/02 18:34:13 DEBUG security.SaslRpcClient: Sending sasl message
> state: INITIATE
> token: "`\202\002x\006\t*\206H\206\367\022\001\002\002\001\000n\
> 202\002g0\202\002c\240\003\002\001\005\241\003\002\001\016\242\a\003\005\000
> \000\000\000\243\202\001oa\202\001k0\202\001g\240\003\
> 002\001\005\241\022\033\020platalyticsrealm\242\0360\
> 034\240\003\002\001\000\241\0250\023\033\002nn\033\
> rhadoop-master\243\202\001*0\202\001&\240\003\002\001\020\
> 241\003\002\001\004\242\202\001\030\004\202\001\024%V)\
> 276.\252PU{,\\\254\272d-M\215\234q\261\032\231\024\201L\230\
> 200\262e\206l7ag1\321o\366\347z\363\222\245\232\360\272\245\276\034\025
> \024\205~\260z\201=\234\266\000\200C\000*\f\211j\261I\357\
> '\371\227\241>\\\200\267\rIl\340\243s\274\302i\256\222\210&
> \305\332\375n\253U\367`\320~:\245]xN?=\226Dk\271\217\352\
> 330N\272p\363\\%N\355\255\342v\t\3776\330m\244\"\303\
> 223\020\004\004\362l\324\004\311\251\024\225G\026\272boX_O\2168#\245\\\035X\370\325\207#=
> \v\247\276\030%\037\361{LT\354\275\246\324\005L\202\003d\
> 375ZN$\330q\325Z\025L.\343\022\210\031\031\t\301\3711\235n\316\324oz
> \366\202\273\366(\321\355\243Ti\001\236\244L@\342\340\
> 374\3655D\301%\214P\037\300\001\035\300c\245E\270V\337\
> 367\370\312\206\213\226\f\\I\352\360\251p\234.\0166WeG\227\
> t\214$\361\000\244\201\3320\201\327\240\003\002\001\020\
> 242\201\317\004\201\314\361\366\276:\247\300\032\004\320r\
> 336W\224\321\376\026~\350\tr\327\203T\263\034\230Y6\206x\
> 022\245\002\343\266\214\306\203\265\311|S\243\311y\257\
> 310\270\032\263\262\246~\002\032\245\234A\352\b\207\250\
> 345\321\016\355i\\\3123c$\310K\341W\325\303\2579\n\336\
> 366\237c;Dy[)\367\232\260.\213\034\357J\v\331:Uu\3058\
> 267\\P\021\016t\276W\334p0\335\257\0245\227\034\024\
> 021pF\375\371\214\024`\3365\330\334\201\206\3071\037\
> 370je-\267\212\357\362a!\000,\215O:I\036$\200\372V\320-\
> 016R\256)+jJ\307\026\217\265\330\354A\030\0034\362\330\
> 224y\202\310\r\342\020r9\205\271\367\273T\\q!I#\245J\320"
> auths {
>   method: "KERBEROS"
>   mechanism: "GSSAPI"
>   protocol: "nn"
>   serverId: "hadoop-master"
> }
>
> 16/08/02 18:34:13 DEBUG security.SaslRpcClient: Received SASL message
> state: CHALLENGE
> token: "`l\006\t*\206H\206\367\022\001\002\002\002\000o]0[\240\
> 003\002\001\005\241\003\002\001\017\242O0M\240\003\002\
> 001\020\242F\004D\337\316\251\336\365\261O@\377
> \"\035\203\002\357Z\231e\332\357\364\204>d\325\"\340\263\
> 2302\031\277\023G\342=\355\334)\303\271\t\376\252\225\
> 207\033\000\243\332\252\335{\"\033\025 \fW\225\300\375\272\201\367\
> 216\371\273"
>
> Entered Krb5Context.initSecContext with state=STATE_IN_PROCESS
> >>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
> Krb5Context setting peerSeqNumber to: 766454664
> 16/08/02 18:34:13 DEBUG security.SaslRpcClient: Sending sasl message
> state: RESPONSE
> token: ""
>
> 16/08/02 18:34:13 DEBUG security.SaslRpcClient: Received SASL message
> state: CHALLENGE
> token: "`?\006\t*\206H\206\367\022\001\002\002\002\001\004\000\377\377\377\377\272
> \237\354\300\003\367{\207A\267\371\245\327\374\333\021\
> 026\375}\353\035\254\327\305\272\373\305\365L\022\374.A\
> 203\002\001\001\000\000\004\004\004\004"
>
> Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02 01 04
> 00 ff ff ff ff ba 20 9f ec c0 03 f7 7b 87 41 b7 f9 a5 d7 fc db 11 16 fd 7d
> eb 1d ac d7 c5 ba fb c5 f5 4c 12 fc 2e 41 83 02 01 01 00 00 04 04 04 04 ]
> Krb5Context.unwrap: data=[01 01 00 00 ]
> Krb5Context.wrap: data=[01 01 00 00 ]
> Krb5Context.wrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02 01 04
> 00 ff ff ff ff 33 b9 e5 96 b6 c8 d3 80 4f 8a a1 5b 44 c9 b6 76 ea fe ec 80
> be 37 12 e1 04 cc e5 0f 2a f8 16 1b 9e 72 17 dc 01 01 00 00 04 04 04 04 ]
> 16/08/02 18:34:13 DEBUG security.SaslRpcClient: Sending sasl message
> state: RESPONSE
> token: "`?\006\t*\206H\206\367\022\001\002\002\002\001\004\000\
> 377\377\377\3773\271\345\226\266\310\323\200O\212\241[D\
> 311\266v\352\376\354\200\2767\022\341\004\314\345\017*\370\
> 026\033\236r\027\334\001\001\000\000\004\004\004\004"
>
> 16/08/02 18:34:13 DEBUG security.SaslRpcClient: Received SASL message
> state: SUCCESS
>
> 16/08/02 18:34:13 DEBUG ipc.Client: Negotiated QOP is :auth
> 16/08/02 18:34:13 DEBUG ipc.Client: IPC Client (1594470328) connection to /
> 192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm: starting,
> having connections 1
> 16/08/02 18:34:13 DEBUG ipc.Client: IPC Client (1594470328) connection to /
> 192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm sending #0
> 16/08/02 18:34:13 DEBUG ipc.Client: IPC Client (1594470328) connection to /
> 192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm got value #0
> 16/08/02 18:34:13 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 594ms
> 16/08/02 18:34:14 TRACE ipc.ProtobufRpcEngine: 1: Response <- /
> 192.168.23.206:8020: getFileInfo {fs { fileType: IS_DIR path: "" length:
> 0 permission { perm: 493 } owner: "hdfs" group: "supergroup"
> modification_time: 1470131070337 access_time: 0 block_replication: 0
> blocksize: 0 fileId: 16385 childrenNum: 1 storagePolicy: 0 }}
> 16/08/02 18:34:14 TRACE ipc.ProtobufRpcEngine: 1: Call -> /
> 192.168.23.206:8020: getListing {src: "/" startAfter: "" needLocation:
> false}
> 16/08/02 18:34:14 DEBUG ipc.Client: IPC Client (1594470328) connection to /
> 192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm sending #1
> 16/08/02 18:34:14 DEBUG ipc.Client: IPC Client (1594470328) connection to /
> 192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm got value #1
> 16/08/02 18:34:14 DEBUG ipc.ProtobufRpcEngine: Call: getListing took 7ms
> 16/08/02 18:34:14 TRACE ipc.ProtobufRpcEngine: 1: Response <- /
> 192.168.23.206:8020: getListing {dirList { partialListing { fileType:
> IS_DIR path: "ranger" length: 0 permission { perm: 493 } owner: "hdfs"
> group: "supergroup" modification_time: 1470131070364 access_time: 0
> block_replication: 0 blocksize: 0 fileId: 16386 childrenNum: 1
> storagePolicy: 0 } remainingEntries: 0 }}
> *Found 1 items*
> *drwxr-xr-x   - hdfs supergroup          0 2016-08-02 14:44 /ranger*
> 16/08/02 18:34:14 DEBUG ipc.Client: stopping client from cache:
> org.apache.hadoop.ipc.Client@5e0df7af
> 16/08/02 18:34:14 DEBUG ipc.Client: removing client from cache:
> org.apache.hadoop.ipc.Client@5e0df7af
> 16/08/02 18:34:14 DEBUG ipc.Client: stopping actual client because no more
> references remain: org.apache.hadoop.ipc.Client@5e0df7af
> 16/08/02 18:34:14 DEBUG ipc.Client: Stopping client
> 16/08/02 18:34:14 DEBUG ipc.Client: IPC Client (1594470328) connection to /
> 192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm: closed
> 16/08/02 18:34:14 DEBUG ipc.Client: IPC Client (1594470328) connection to /
> 192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm: stopped,
> remaining connections 0
>
> Thanks
>

issue starting regionserver with SASL authentication failed

Posted by Aneela Saleem <an...@platalytics.com>.
Hi all,

I'm facing an issue starting the region server in HBase. I have enabled
Kerberos debugging on the Hadoop command line, so when I run the "hadoop fs
-ls /" command, I get the following output, which I can't interpret. Can
anyone please tell me whether something is wrong with the Kerberos
configuration, or whether everything is fine?
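
For reference, one way to produce this debug output is to pass the JVM flag
through HADOOP_OPTS for a single command (a sketch; the exact mechanism used
here may differ):

    HADOOP_OPTS="-Dsun.security.krb5.debug=true" hadoop fs -ls /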


16/08/02 18:34:10 DEBUG util.Shell: setsid exited with exit code 0
16/08/02 18:34:10 DEBUG conf.Configuration: parsing URL
jar:file:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.7.2.jar!/core-default.xml
16/08/02 18:34:10 DEBUG conf.Configuration: parsing input stream
sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@4fbc7b65
16/08/02 18:34:10 DEBUG conf.Configuration: parsing URL
file:/usr/local/hadoop/etc/hadoop/core-site.xml
16/08/02 18:34:10 DEBUG conf.Configuration: parsing input stream
java.io.BufferedInputStream@69c1adfa
16/08/02 18:34:11 DEBUG lib.MutableMetricsFactory: field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess
with annotation
@org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate
of successful kerberos logins and latency (milliseconds)], about=,
always=false, type=DEFAULT, sampleName=Ops)
16/08/02 18:34:11 DEBUG lib.MutableMetricsFactory: field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure
with annotation
@org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate
of failed kerberos logins and latency (milliseconds)], about=,
always=false, type=DEFAULT, sampleName=Ops)
16/08/02 18:34:11 DEBUG lib.MutableMetricsFactory: field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with
annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time,
value=[GetGroups], about=, always=false, type=DEFAULT, sampleName=Ops)
16/08/02 18:34:11 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group
related metrics
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
16/08/02 18:34:11 DEBUG security.Groups:  Creating new Groups object
16/08/02 18:34:11 DEBUG security.Groups: Group mapping
impl=org.apache.hadoop.security.LdapGroupsMapping; cacheTimeout=300000;
warningDeltaMs=5000
>>>KinitOptions cache name is /tmp/krb5cc_0
>>>DEBUG <CCacheInputStream>  client principal is
nn/hadoop-master@platalyticsrealm
>>>DEBUG <CCacheInputStream> server principal is
krbtgt/platalyticsrealm@platalyticsrealm
>>>DEBUG <CCacheInputStream> key type: 16
>>>DEBUG <CCacheInputStream> auth time: Tue Aug 02 18:23:59 PKT 2016
>>>DEBUG <CCacheInputStream> start time: Tue Aug 02 18:23:59 PKT 2016
>>>DEBUG <CCacheInputStream> end time: Wed Aug 03 06:23:59 PKT 2016
>>>DEBUG <CCacheInputStream> renew_till time: Tue Aug 09 18:23:59 PKT 2016
>>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE; INITIAL;
>>>DEBUG <CCacheInputStream>  client principal is
nn/hadoop-master@platalyticsrealm
>>>DEBUG <CCacheInputStream> server principal is
X-CACHECONF:/krb5_ccache_conf_data/fast_avail/krbtgt/platalyticsrealm@platalyticsrealm
>>>DEBUG <CCacheInputStream> key type: 0
>>>DEBUG <CCacheInputStream> auth time: Thu Jan 01 05:00:00 PKT 1970
>>>DEBUG <CCacheInputStream> start time: null
>>>DEBUG <CCacheInputStream> end time: Thu Jan 01 05:00:00 PKT 1970
>>>DEBUG <CCacheInputStream> renew_till time: null
>>> CCacheInputStream: readFlags()
16/08/02 18:34:11 DEBUG security.UserGroupInformation: hadoop login
16/08/02 18:34:11 DEBUG security.UserGroupInformation: hadoop login commit
16/08/02 18:34:11 DEBUG security.UserGroupInformation: using kerberos
user:nn/hadoop-master@platalyticsrealm
16/08/02 18:34:11 DEBUG security.UserGroupInformation: Using user:
"nn/hadoop-master@platalyticsrealm" with name
nn/hadoop-master@platalyticsrealm
16/08/02 18:34:11 DEBUG security.UserGroupInformation: User entry:
"nn/hadoop-master@platalyticsrealm"
16/08/02 18:34:11 DEBUG security.UserGroupInformation: UGI
loginUser:nn/hadoop-master@platalyticsrealm (auth:KERBEROS)
16/08/02 18:34:12 DEBUG security.UserGroupInformation: Found tgt Ticket
(hex) =
0000: 61 82 01 72 30 82 01 6E   A0 03 02 01 05 A1 12 1B  a..r0..n........
0010: 10 70 6C 61 74 61 6C 79   74 69 63 73 72 65 61 6C  .platalyticsreal
0020: 6D A2 25 30 23 A0 03 02   01 02 A1 1C 30 1A 1B 06  m.%0#.......0...
0030: 6B 72 62 74 67 74 1B 10   70 6C 61 74 61 6C 79 74  krbtgt..platalyt
0040: 69 63 73 72 65 61 6C 6D   A3 82 01 2A 30 82 01 26  icsrealm...*0..&
0050: A0 03 02 01 10 A1 03 02   01 01 A2 82 01 18 04 82  ................
0060: 01 14 A5 A9 41 A6 B7 0E   8F 70 F4 03 41 64 8D DC  ....A....p..Ad..
0070: 78 2F FB 08 58 C9 39 44   CF D0 8D B0 85 09 62 8C  x/..X.9D......b.
0080: 40 CF 45 13 D3 B9 CD 38   84 92 33 24 B2 0D C1 65  @.E....8..3$...e
0090: C7 1B 0D 3E F2 92 A2 8B   58 34 77 5F F6 E3 AA B6  ...>....X4w_....
00A0: EB 8E 58 46 AC 54 DB 9B   79 3E ED A1 83 0C D3 D3  ..XF.T..y>......
00B0: 02 8B 42 52 6D 92 F1 39   BA E7 56 D4 BA A6 03 B6  ..BRm..9..V.....
00C0: 16 5A DC 1A 69 F4 DF A5   CD F6 48 AC 08 32 D3 AD  .Z..i.....H..2..
00D0: 22 8E E9 52 00 93 78 41   1C 26 4F 0B 42 2C EF E9  "..R..xA.&O.B,..
00E0: B8 0E 84 39 E4 AF 3A 60   7D 04 EE 70 18 C0 E7 21  ...9..:`...p...!
00F0: 0B 70 18 42 33 5E D9 CA   94 C0 6F 6A C0 39 72 7B  .p.B3^....oj.9r.
0100: FD 6E F1 09 CE 2D 02 EA   DA 52 5C 1B B2 18 36 0E  .n...-...R\...6.
0110: 54 94 DD 7A 47 A8 F2 36   53 18 3D D7 5C 68 58 71  T..zG..6S.=.\hXq
0120: 63 DB 36 88 B9 87 62 DC   BA 86 C3 F0 55 05 D8 15  c.6...b.....U...
0130: 6E 70 FD 8E 64 63 3D 51   36 EC 9E 63 30 77 BE 98  np..dc=Q6..c0w..
0140: 1D A0 DC 97 04 6F 03 AB   12 52 F8 68 7C 6C D0 88  .....o...R.h.l..
0150: 16 FC 17 69 3E 02 4B 59   E8 22 B3 1B 13 70 B2 6A  ...i>.KY."...p.j
0160: 3F 05 3B 1C 91 3D 03 A8   30 64 1C B1 59 42 17 FB  ?.;..=..0d..YB..
0170: 1B B2 76 E0 BC 49                                  ..v..I

Client Principal = nn/hadoop-master@platalyticsrealm
Server Principal = krbtgt/platalyticsrealm@platalyticsrealm
Session Key = EncryptionKey: keyType=16 keyBytes (hex dump)=
0000: B5 4A 9B 0E 1C 6D 1C 34   D5 DF DA F2 9D 4C C2 FE  .J...m.4.....L..
0010: D9 0D 67 A2 79 6D 8C 0D                            ..g.ym..


Forwardable Ticket true
Forwarded Ticket false
Proxiable Ticket false
Proxy Ticket false
Postdated Ticket false
Renewable Ticket true
Initial Ticket true
Auth Time = Tue Aug 02 18:23:59 PKT 2016
Start Time = Tue Aug 02 18:23:59 PKT 2016
End Time = Wed Aug 03 06:23:59 PKT 2016
Renew Till = Tue Aug 09 18:23:59 PKT 2016
Client Addresses  Null
16/08/02 18:34:12 DEBUG security.UserGroupInformation: Current time is
1470144852023
16/08/02 18:34:12 DEBUG security.UserGroupInformation: Next refresh is
1470178799000
16/08/02 18:34:12 TRACE tracing.SpanReceiverHost: No span receiver names
found in dfs.client.htrace.spanreceiver.classes.
16/08/02 18:34:12 DEBUG hdfs.BlockReaderLocal:
dfs.client.use.legacy.blockreader.local = false
16/08/02 18:34:12 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit
= false
16/08/02 18:34:12 DEBUG hdfs.BlockReaderLocal:
dfs.client.domain.socket.data.traffic = false
16/08/02 18:34:12 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
16/08/02 18:34:12 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
16/08/02 18:34:12 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER,
rpcRequestWrapperClass=class
org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper,
rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@4219a40f
16/08/02 18:34:12 DEBUG ipc.Client: getting client out of cache:
org.apache.hadoop.ipc.Client@5e0df7af
16/08/02 18:34:13 DEBUG util.NativeCodeLoader: Trying to load the
custom-built native-hadoop library...
16/08/02 18:34:13 DEBUG util.NativeCodeLoader: Loaded the native-hadoop
library
16/08/02 18:34:13 DEBUG unix.DomainSocketWatcher:
org.apache.hadoop.net.unix.DomainSocketWatcher$2@1a1ff7d1: starting with
interruptCheckPeriodMs = 60000
16/08/02 18:34:13 TRACE unix.DomainSocketWatcher:
DomainSocketWatcher(1934811148): adding notificationSocket 191, connected
to 190
16/08/02 18:34:13 DEBUG util.PerformanceAdvisory: Both short-circuit local
reads and UNIX domain socket are disabled.
16/08/02 18:34:13 DEBUG sasl.DataTransferSaslUtil: DataTransferProtocol not
using SaslPropertiesResolver, no QOP found in configuration for
dfs.data.transfer.protection
16/08/02 18:34:13 TRACE ipc.ProtobufRpcEngine: 1: Call -> /
192.168.23.206:8020: getFileInfo {src: "/"}
16/08/02 18:34:13 DEBUG ipc.Client: The ping interval is 60000 ms.
16/08/02 18:34:13 DEBUG ipc.Client: Connecting to /192.168.23.206:8020
16/08/02 18:34:13 DEBUG security.UserGroupInformation: PrivilegedAction
as:nn/hadoop-master@platalyticsrealm (auth:KERBEROS)
from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:724)
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Sending sasl message state:
NEGOTIATE

16/08/02 18:34:13 DEBUG security.SaslRpcClient: Received SASL message
state: NEGOTIATE
auths {
  method: "TOKEN"
  mechanism: "DIGEST-MD5"
  protocol: ""
  serverId: "default"
  challenge:
"realm=\"default\",nonce=\"xHi0jI3ZHzKXd2aQ0Gqx4N1qcgbdJAWBCa36ZeSO\",qop=\"auth\",charset=utf-8,algorithm=md5-sess"
}
auths {
  method: "KERBEROS"
  mechanism: "GSSAPI"
  protocol: "nn"
  serverId: "hadoop-master"
}

16/08/02 18:34:13 DEBUG security.SaslRpcClient: Get token info
proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB
info:@org.apache.hadoop.security.token.TokenInfo(value=class
org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector)
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Get kerberos info
proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB
info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=,
serverPrincipal=dfs.namenode.kerberos.principal)
16/08/02 18:34:13 DEBUG security.SaslRpcClient: RPC Server's Kerberos
principal name for
protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is
nn/hadoop-master@platalyticsrealm
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Creating SASL
GSSAPI(KERBEROS)  client to authenticate to service at hadoop-master
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Use KERBEROS authentication
for protocol ClientNamenodeProtocolPB
Found ticket for nn/hadoop-master@platalyticsrealm to go to
krbtgt/platalyticsrealm@platalyticsrealm expiring on Wed Aug 03 06:23:59
PKT 2016
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for nn/hadoop-master@platalyticsrealm to go to
krbtgt/platalyticsrealm@platalyticsrealm expiring on Wed Aug 03 06:23:59
PKT 2016
Service ticket not found in the subject
>>> Credentials acquireServiceCreds: same realm
Using builtin default etypes for default_tgs_enctypes
default etypes for default_tgs_enctypes: 18 17 16 23 1 3.
>>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>> KdcAccessibility: reset
>>> KrbKdcReq send: kdc=platalytics.com UDP:88, timeout=30000, number of
retries =3, #bytes=727
>>> KDCCommunication: kdc=platalytics.com UDP:88, timeout=30000,Attempt =1,
#bytes=727
>>> KrbKdcReq send: #bytes read=686
>>> KdcAccessibility: remove platalytics.com
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>> KrbApReq: APOptions are 00100000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
Krb5Context setting mySeqNumber to: 822249937
Created InitSecContextToken:
0000: 01 00 6E 82 02 67 30 82   02 63 A0 03 02 01 05 A1  ..n..g0..c......
0010: 03 02 01 0E A2 07 03 05   00 20 00 00 00 A3 82 01  ......... ......
0020: 6F 61 82 01 6B 30 82 01   67 A0 03 02 01 05 A1 12  oa..k0..g.......
0030: 1B 10 70 6C 61 74 61 6C   79 74 69 63 73 72 65 61  ..platalyticsrea
0040: 6C 6D A2 1E 30 1C A0 03   02 01 00 A1 15 30 13 1B  lm..0........0..
0050: 02 6E 6E 1B 0D 68 61 64   6F 6F 70 2D 6D 61 73 74  .nn..hadoop-mast
0060: 65 72 A3 82 01 2A 30 82   01 26 A0 03 02 01 10 A1  er...*0..&......
0070: 03 02 01 04 A2 82 01 18   04 82 01 14 25 56 29 BE  ............%V).
0080: 2E AA 50 55 7B 2C 5C AC   BA 64 2D 4D 8D 9C 71 B1  ..PU.,\..d-M..q.
0090: 1A 99 14 81 4C 98 80 B2   65 86 6C 37 61 67 31 D1  ....L...e.l7ag1.
00A0: 6F F6 E7 7A F3 92 A5 9A   F0 BA A5 BE 1C 15 7F 14  o..z............
00B0: 85 7E B0 7A 81 3D 9C B6   00 80 43 00 2A 0C 89 6A  ...z.=....C.*..j
00C0: B1 49 EF 27 F9 97 A1 3E   5C 80 B7 0D 49 6C E0 A3  .I.'...>\...Il..
00D0: 73 BC C2 69 AE 92 88 26   C5 DA FD 6E AB 55 F7 60  s..i...&...n.U.`
00E0: D0 7E 3A A5 5D 78 4E 3F   3D 96 44 6B B9 8F EA D8  ..:.]xN?=.Dk....
00F0: 4E BA 70 F3 5C 25 4E ED   AD E2 76 09 FF 36 D8 6D  N.p.\%N...v..6.m
0100: A4 22 C3 93 10 04 04 F2   6C D4 04 C9 A9 14 95 47  ."......l......G
0110: 16 BA 62 6F 58 5F 4F 8E   38 23 A5 5C 1D 58 F8 D5  ..boX_O.8#.\.X..
0120: 87 23 3D 7F 0B A7 BE 18   25 1F F1 7B 4C 54 EC BD  .#=.....%...LT..
0130: A6 D4 05 4C 82 03 64 FD   5A 4E 24 D8 71 D5 5A 15  ...L..d.ZN$.q.Z.
0140: 4C 2E E3 12 88 19 19 09   C1 F9 31 9D 6E CE D4 6F  L.........1.n..o
0150: 7A 20 F6 82 BB F6 28 D1   ED A3 54 69 01 9E A4 4C  z ....(...Ti...L
0160: 40 E2 E0 FC F5 35 44 C1   25 8C 50 1F C0 01 1D C0  @....5D.%.P.....
0170: 63 A5 45 B8 56 DF F7 F8   CA 86 8B 96 0C 5C 49 EA  c.E.V........\I.
0180: F0 A9 70 9C 2E 0E 36 57   65 47 97 09 8C 24 F1 00  ..p...6WeG...$..
0190: A4 81 DA 30 81 D7 A0 03   02 01 10 A2 81 CF 04 81  ...0............
01A0: CC F1 F6 BE 3A A7 C0 1A   04 D0 72 DE 57 94 D1 FE  ....:.....r.W...
01B0: 16 7E E8 09 72 D7 83 54   B3 1C 98 59 36 86 78 12  ....r..T...Y6.x.
01C0: A5 02 E3 B6 8C C6 83 B5   C9 7C 53 A3 C9 79 AF C8  ..........S..y..
01D0: B8 1A B3 B2 A6 7E 02 1A   A5 9C 41 EA 08 87 A8 E5  ..........A.....
01E0: D1 0E ED 69 5C CA 33 63   24 C8 4B E1 57 D5 C3 AF  ...i\.3c$.K.W...
01F0: 39 0A DE F6 9F 63 3B 44   79 5B 29 F7 9A B0 2E 8B  9....c;Dy[).....
0200: 1C EF 4A 0B D9 3A 55 75   C5 38 B7 5C 50 11 0E 74  ..J..:Uu.8.\P..t
0210: BE 57 DC 70 30 DD AF 14   35 97 1C 14 11 70 46 FD  .W.p0...5....pF.
0220: F9 8C 14 60 DE 35 D8 DC   81 86 C7 31 1F F8 6A 65  ...`.5.....1..je
0230: 2D B7 8A EF F2 61 21 00   2C 8D 4F 3A 49 1E 24 80  -....a!.,.O:I.$.
0240: FA 56 D0 2D 0E 52 AE 29   2B 6A 4A C7 16 8F B5 D8  .V.-.R.)+jJ.....
0250: EC 41 18 03 34 F2 D8 94   79 82 C8 0D E2 10 72 39  .A..4...y.....r9
0260: 85 B9 F7 BB 54 5C 71 21   49 23 A5 4A D0           ....T\q!I#.J.

16/08/02 18:34:13 DEBUG security.SaslRpcClient: Sending sasl message state:
INITIATE
token:
"`\202\002x\006\t*\206H\206\367\022\001\002\002\001\000n\202\002g0\202\002c\240\003\002\001\005\241\003\002\001\016\242\a\003\005\000
\000\000\000\243\202\001oa\202\001k0\202\001g\240\003\002\001\005\241\022\033\020platalyticsrealm\242\0360\034\240\003\002\001\000\241\0250\023\033\002nn\033\rhadoop-master\243\202\001*0\202\001&\240\003\002\001\020\241\003\002\001\004\242\202\001\030\004\202\001\024%V)\276.\252PU{,\\\254\272d-M\215\234q\261\032\231\024\201L\230\200\262e\206l7ag1\321o\366\347z\363\222\245\232\360\272\245\276\034\025
\024\205~\260z\201=\234\266\000\200C\000*\f\211j\261I\357\'\371\227\241>\\\200\267\rIl\340\243s\274\302i\256\222\210&\305\332\375n\253U\367`\320~:\245]xN?=\226Dk\271\217\352\330N\272p\363\\%N\355\255\342v\t\3776\330m\244\"\303\223\020\004\004\362l\324\004\311\251\024\225G\026\272boX_O\2168#\245\\\035X\370\325\207#=
\v\247\276\030%\037\361{LT\354\275\246\324\005L\202\003d\375ZN$\330q\325Z\025L.\343\022\210\031\031\t\301\3711\235n\316\324oz
\366\202\273\366(\321\355\243Ti\001\236\244L@
\342\340\374\3655D\301%\214P\037\300\001\035\300c\245E\270V\337\367\370\312\206\213\226\f\\I\352\360\251p\234.\0166WeG\227\t\214$\361\000\244\201\3320\201\327\240\003\002\001\020\242\201\317\004\201\314\361\366\276:\247\300\032\004\320r\336W\224\321\376\026~\350\tr\327\203T\263\034\230Y6\206x\022\245\002\343\266\214\306\203\265\311|S\243\311y\257\310\270\032\263\262\246~\002\032\245\234A\352\b\207\250\345\321\016\355i\\\3123c$\310K\341W\325\303\2579\n\336\366\237c;Dy[)\367\232\260.\213\034\357J\v\331:Uu\3058\267\\P\021\016t\276W\334p0\335\257\0245\227\034\024\021pF\375\371\214\024`\3365\330\334\201\206\3071\037\370je-\267\212\357\362a!\000,\215O:I\036$\200\372V\320-\016R\256)+jJ\307\026\217\265\330\354A\030\0034\362\330\224y\202\310\r\342\020r9\205\271\367\273T\\q!I#\245J\320"
auths {
  method: "KERBEROS"
  mechanism: "GSSAPI"
  protocol: "nn"
  serverId: "hadoop-master"
}

16/08/02 18:34:13 DEBUG security.SaslRpcClient: Received SASL message
state: CHALLENGE
token:
"`l\006\t*\206H\206\367\022\001\002\002\002\000o]0[\240\003\002\001\005\241\003\002\001\017\242O0M\240\003\002\001\020\242F\004D\337\316\251\336\365\261O@\377
\"\035\203\002\357Z\231e\332\357\364\204>d\325\"\340\263\2302\031\277\023G\342=\355\334)\303\271\t\376\252\225\207\033\000\243\332\252\335{\"\033\025
\fW\225\300\375\272\201\367\216\371\273"

Entered Krb5Context.initSecContext with state=STATE_IN_PROCESS
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
Krb5Context setting peerSeqNumber to: 766454664
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Sending sasl message state:
RESPONSE
token: ""

16/08/02 18:34:13 DEBUG security.SaslRpcClient: Received SASL message
state: CHALLENGE
token:
"`?\006\t*\206H\206\367\022\001\002\002\002\001\004\000\377\377\377\377\272
\237\354\300\003\367{\207A\267\371\245\327\374\333\021\026\375}\353\035\254\327\305\272\373\305\365L\022\374.A\203\002\001\001\000\000\004\004\004\004"

Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02 01 04
00 ff ff ff ff ba 20 9f ec c0 03 f7 7b 87 41 b7 f9 a5 d7 fc db 11 16 fd 7d
eb 1d ac d7 c5 ba fb c5 f5 4c 12 fc 2e 41 83 02 01 01 00 00 04 04 04 04 ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02 01 04 00
ff ff ff ff 33 b9 e5 96 b6 c8 d3 80 4f 8a a1 5b 44 c9 b6 76 ea fe ec 80 be
37 12 e1 04 cc e5 0f 2a f8 16 1b 9e 72 17 dc 01 01 00 00 04 04 04 04 ]
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Sending sasl message state:
RESPONSE
token:
"`?\006\t*\206H\206\367\022\001\002\002\002\001\004\000\377\377\377\3773\271\345\226\266\310\323\200O\212\241[D\311\266v\352\376\354\200\2767\022\341\004\314\345\017*\370\026\033\236r\027\334\001\001\000\000\004\004\004\004"

16/08/02 18:34:13 DEBUG security.SaslRpcClient: Received SASL message
state: SUCCESS

16/08/02 18:34:13 DEBUG ipc.Client: Negotiated QOP is :auth
16/08/02 18:34:13 DEBUG ipc.Client: IPC Client (1594470328) connection to /
192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm: starting,
having connections 1
16/08/02 18:34:13 DEBUG ipc.Client: IPC Client (1594470328) connection to /
192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm sending #0
16/08/02 18:34:13 DEBUG ipc.Client: IPC Client (1594470328) connection to /
192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm got value #0
16/08/02 18:34:13 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 594ms
16/08/02 18:34:14 TRACE ipc.ProtobufRpcEngine: 1: Response <- /
192.168.23.206:8020: getFileInfo {fs { fileType: IS_DIR path: "" length: 0
permission { perm: 493 } owner: "hdfs" group: "supergroup"
modification_time: 1470131070337 access_time: 0 block_replication: 0
blocksize: 0 fileId: 16385 childrenNum: 1 storagePolicy: 0 }}
16/08/02 18:34:14 TRACE ipc.ProtobufRpcEngine: 1: Call -> /
192.168.23.206:8020: getListing {src: "/" startAfter: "" needLocation:
false}
16/08/02 18:34:14 DEBUG ipc.Client: IPC Client (1594470328) connection to /
192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm sending #1
16/08/02 18:34:14 DEBUG ipc.Client: IPC Client (1594470328) connection to /
192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm got value #1
16/08/02 18:34:14 DEBUG ipc.ProtobufRpcEngine: Call: getListing took 7ms
16/08/02 18:34:14 TRACE ipc.ProtobufRpcEngine: 1: Response <- /
192.168.23.206:8020: getListing {dirList { partialListing { fileType:
IS_DIR path: "ranger" length: 0 permission { perm: 493 } owner: "hdfs"
group: "supergroup" modification_time: 1470131070364 access_time: 0
block_replication: 0 blocksize: 0 fileId: 16386 childrenNum: 1
storagePolicy: 0 } remainingEntries: 0 }}
*Found 1 items*
*drwxr-xr-x   - hdfs supergroup          0 2016-08-02 14:44 /ranger*
16/08/02 18:34:14 DEBUG ipc.Client: stopping client from cache:
org.apache.hadoop.ipc.Client@5e0df7af
16/08/02 18:34:14 DEBUG ipc.Client: removing client from cache:
org.apache.hadoop.ipc.Client@5e0df7af
16/08/02 18:34:14 DEBUG ipc.Client: stopping actual client because no more
references remain: org.apache.hadoop.ipc.Client@5e0df7af
16/08/02 18:34:14 DEBUG ipc.Client: Stopping client
16/08/02 18:34:14 DEBUG ipc.Client: IPC Client (1594470328) connection to /
192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm: closed
16/08/02 18:34:14 DEBUG ipc.Client: IPC Client (1594470328) connection to /
192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm: stopped,
remaining connections 0

Thanks

Re: issue starting regionserver with SASL authentication failed

Posted by Aneela Saleem <an...@platalytics.com>.
I have enabled Kerberos debugging on the Hadoop command line, so when I run
the "hadoop fs -ls /" command, I get the following output, which I can't
interpret. Can you please tell me whether something is wrong with the
Kerberos configuration, or whether everything is fine?


16/08/02 18:34:10 DEBUG util.Shell: setsid exited with exit code 0
16/08/02 18:34:10 DEBUG conf.Configuration: parsing URL
jar:file:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.7.2.jar!/core-default.xml
16/08/02 18:34:10 DEBUG conf.Configuration: parsing input stream
sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@4fbc7b65
16/08/02 18:34:10 DEBUG conf.Configuration: parsing URL
file:/usr/local/hadoop/etc/hadoop/core-site.xml
16/08/02 18:34:10 DEBUG conf.Configuration: parsing input stream
java.io.BufferedInputStream@69c1adfa
16/08/02 18:34:11 DEBUG lib.MutableMetricsFactory: field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess
with annotation
@org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate
of successful kerberos logins and latency (milliseconds)], about=,
always=false, type=DEFAULT, sampleName=Ops)
16/08/02 18:34:11 DEBUG lib.MutableMetricsFactory: field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure
with annotation
@org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate
of failed kerberos logins and latency (milliseconds)], about=,
always=false, type=DEFAULT, sampleName=Ops)
16/08/02 18:34:11 DEBUG lib.MutableMetricsFactory: field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with
annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time,
value=[GetGroups], about=, always=false, type=DEFAULT, sampleName=Ops)
16/08/02 18:34:11 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group
related metrics
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
16/08/02 18:34:11 DEBUG security.Groups:  Creating new Groups object
16/08/02 18:34:11 DEBUG security.Groups: Group mapping
impl=org.apache.hadoop.security.LdapGroupsMapping; cacheTimeout=300000;
warningDeltaMs=5000
>>>KinitOptions cache name is /tmp/krb5cc_0
>>>DEBUG <CCacheInputStream>  client principal is
nn/hadoop-master@platalyticsrealm
>>>DEBUG <CCacheInputStream> server principal is
krbtgt/platalyticsrealm@platalyticsrealm
>>>DEBUG <CCacheInputStream> key type: 16
>>>DEBUG <CCacheInputStream> auth time: Tue Aug 02 18:23:59 PKT 2016
>>>DEBUG <CCacheInputStream> start time: Tue Aug 02 18:23:59 PKT 2016
>>>DEBUG <CCacheInputStream> end time: Wed Aug 03 06:23:59 PKT 2016
>>>DEBUG <CCacheInputStream> renew_till time: Tue Aug 09 18:23:59 PKT 2016
>>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE; INITIAL;
>>>DEBUG <CCacheInputStream>  client principal is
nn/hadoop-master@platalyticsrealm
>>>DEBUG <CCacheInputStream> server principal is
X-CACHECONF:/krb5_ccache_conf_data/fast_avail/krbtgt/platalyticsrealm@platalyticsrealm
>>>DEBUG <CCacheInputStream> key type: 0
>>>DEBUG <CCacheInputStream> auth time: Thu Jan 01 05:00:00 PKT 1970
>>>DEBUG <CCacheInputStream> start time: null
>>>DEBUG <CCacheInputStream> end time: Thu Jan 01 05:00:00 PKT 1970
>>>DEBUG <CCacheInputStream> renew_till time: null
>>> CCacheInputStream: readFlags()
16/08/02 18:34:11 DEBUG security.UserGroupInformation: hadoop login
16/08/02 18:34:11 DEBUG security.UserGroupInformation: hadoop login commit
16/08/02 18:34:11 DEBUG security.UserGroupInformation: using kerberos
user:nn/hadoop-master@platalyticsrealm
16/08/02 18:34:11 DEBUG security.UserGroupInformation: Using user:
"nn/hadoop-master@platalyticsrealm" with name
nn/hadoop-master@platalyticsrealm
16/08/02 18:34:11 DEBUG security.UserGroupInformation: User entry:
"nn/hadoop-master@platalyticsrealm"
16/08/02 18:34:11 DEBUG security.UserGroupInformation: UGI
loginUser:nn/hadoop-master@platalyticsrealm (auth:KERBEROS)
16/08/02 18:34:12 DEBUG security.UserGroupInformation: Found tgt Ticket
(hex) =
0000: 61 82 01 72 30 82 01 6E   A0 03 02 01 05 A1 12 1B  a..r0..n........
0010: 10 70 6C 61 74 61 6C 79   74 69 63 73 72 65 61 6C  .platalyticsreal
0020: 6D A2 25 30 23 A0 03 02   01 02 A1 1C 30 1A 1B 06  m.%0#.......0...
0030: 6B 72 62 74 67 74 1B 10   70 6C 61 74 61 6C 79 74  krbtgt..platalyt
0040: 69 63 73 72 65 61 6C 6D   A3 82 01 2A 30 82 01 26  icsrealm...*0..&
0050: A0 03 02 01 10 A1 03 02   01 01 A2 82 01 18 04 82  ................
0060: 01 14 A5 A9 41 A6 B7 0E   8F 70 F4 03 41 64 8D DC  ....A....p..Ad..
0070: 78 2F FB 08 58 C9 39 44   CF D0 8D B0 85 09 62 8C  x/..X.9D......b.
0080: 40 CF 45 13 D3 B9 CD 38   84 92 33 24 B2 0D C1 65  @.E....8..3$...e
0090: C7 1B 0D 3E F2 92 A2 8B   58 34 77 5F F6 E3 AA B6  ...>....X4w_....
00A0: EB 8E 58 46 AC 54 DB 9B   79 3E ED A1 83 0C D3 D3  ..XF.T..y>......
00B0: 02 8B 42 52 6D 92 F1 39   BA E7 56 D4 BA A6 03 B6  ..BRm..9..V.....
00C0: 16 5A DC 1A 69 F4 DF A5   CD F6 48 AC 08 32 D3 AD  .Z..i.....H..2..
00D0: 22 8E E9 52 00 93 78 41   1C 26 4F 0B 42 2C EF E9  "..R..xA.&O.B,..
00E0: B8 0E 84 39 E4 AF 3A 60   7D 04 EE 70 18 C0 E7 21  ...9..:`...p...!
00F0: 0B 70 18 42 33 5E D9 CA   94 C0 6F 6A C0 39 72 7B  .p.B3^....oj.9r.
0100: FD 6E F1 09 CE 2D 02 EA   DA 52 5C 1B B2 18 36 0E  .n...-...R\...6.
0110: 54 94 DD 7A 47 A8 F2 36   53 18 3D D7 5C 68 58 71  T..zG..6S.=.\hXq
0120: 63 DB 36 88 B9 87 62 DC   BA 86 C3 F0 55 05 D8 15  c.6...b.....U...
0130: 6E 70 FD 8E 64 63 3D 51   36 EC 9E 63 30 77 BE 98  np..dc=Q6..c0w..
0140: 1D A0 DC 97 04 6F 03 AB   12 52 F8 68 7C 6C D0 88  .....o...R.h.l..
0150: 16 FC 17 69 3E 02 4B 59   E8 22 B3 1B 13 70 B2 6A  ...i>.KY."...p.j
0160: 3F 05 3B 1C 91 3D 03 A8   30 64 1C B1 59 42 17 FB  ?.;..=..0d..YB..
0170: 1B B2 76 E0 BC 49                                  ..v..I

Client Principal = nn/hadoop-master@platalyticsrealm
Server Principal = krbtgt/platalyticsrealm@platalyticsrealm
Session Key = EncryptionKey: keyType=16 keyBytes (hex dump)=
0000: B5 4A 9B 0E 1C 6D 1C 34   D5 DF DA F2 9D 4C C2 FE  .J...m.4.....L..
0010: D9 0D 67 A2 79 6D 8C 0D                            ..g.ym..


Forwardable Ticket true
Forwarded Ticket false
Proxiable Ticket false
Proxy Ticket false
Postdated Ticket false
Renewable Ticket true
Initial Ticket true
Auth Time = Tue Aug 02 18:23:59 PKT 2016
Start Time = Tue Aug 02 18:23:59 PKT 2016
End Time = Wed Aug 03 06:23:59 PKT 2016
Renew Till = Tue Aug 09 18:23:59 PKT 2016
Client Addresses  Null
16/08/02 18:34:12 DEBUG security.UserGroupInformation: Current time is
1470144852023
16/08/02 18:34:12 DEBUG security.UserGroupInformation: Next refresh is
1470178799000
16/08/02 18:34:12 TRACE tracing.SpanReceiverHost: No span receiver names
found in dfs.client.htrace.spanreceiver.classes.
16/08/02 18:34:12 DEBUG hdfs.BlockReaderLocal:
dfs.client.use.legacy.blockreader.local = false
16/08/02 18:34:12 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit
= false
16/08/02 18:34:12 DEBUG hdfs.BlockReaderLocal:
dfs.client.domain.socket.data.traffic = false
16/08/02 18:34:12 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
16/08/02 18:34:12 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
16/08/02 18:34:12 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER,
rpcRequestWrapperClass=class
org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper,
rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@4219a40f
16/08/02 18:34:12 DEBUG ipc.Client: getting client out of cache:
org.apache.hadoop.ipc.Client@5e0df7af
16/08/02 18:34:13 DEBUG util.NativeCodeLoader: Trying to load the
custom-built native-hadoop library...
16/08/02 18:34:13 DEBUG util.NativeCodeLoader: Loaded the native-hadoop
library
16/08/02 18:34:13 DEBUG unix.DomainSocketWatcher:
org.apache.hadoop.net.unix.DomainSocketWatcher$2@1a1ff7d1: starting with
interruptCheckPeriodMs = 60000
16/08/02 18:34:13 TRACE unix.DomainSocketWatcher:
DomainSocketWatcher(1934811148): adding notificationSocket 191, connected
to 190
16/08/02 18:34:13 DEBUG util.PerformanceAdvisory: Both short-circuit local
reads and UNIX domain socket are disabled.
16/08/02 18:34:13 DEBUG sasl.DataTransferSaslUtil: DataTransferProtocol not
using SaslPropertiesResolver, no QOP found in configuration for
dfs.data.transfer.protection
16/08/02 18:34:13 TRACE ipc.ProtobufRpcEngine: 1: Call -> /
192.168.23.206:8020: getFileInfo {src: "/"}
16/08/02 18:34:13 DEBUG ipc.Client: The ping interval is 60000 ms.
16/08/02 18:34:13 DEBUG ipc.Client: Connecting to /192.168.23.206:8020
16/08/02 18:34:13 DEBUG security.UserGroupInformation: PrivilegedAction
as:nn/hadoop-master@platalyticsrealm (auth:KERBEROS)
from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:724)
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Sending sasl message state:
NEGOTIATE

16/08/02 18:34:13 DEBUG security.SaslRpcClient: Received SASL message
state: NEGOTIATE
auths {
  method: "TOKEN"
  mechanism: "DIGEST-MD5"
  protocol: ""
  serverId: "default"
  challenge:
"realm=\"default\",nonce=\"xHi0jI3ZHzKXd2aQ0Gqx4N1qcgbdJAWBCa36ZeSO\",qop=\"auth\",charset=utf-8,algorithm=md5-sess"
}
auths {
  method: "KERBEROS"
  mechanism: "GSSAPI"
  protocol: "nn"
  serverId: "hadoop-master"
}

16/08/02 18:34:13 DEBUG security.SaslRpcClient: Get token info
proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB
info:@org.apache.hadoop.security.token.TokenInfo(value=class
org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector)
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Get kerberos info
proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB
info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=,
serverPrincipal=dfs.namenode.kerberos.principal)
16/08/02 18:34:13 DEBUG security.SaslRpcClient: RPC Server's Kerberos
principal name for
protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is
nn/hadoop-master@platalyticsrealm
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Creating SASL
GSSAPI(KERBEROS)  client to authenticate to service at hadoop-master
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Use KERBEROS authentication
for protocol ClientNamenodeProtocolPB
Found ticket for nn/hadoop-master@platalyticsrealm to go to
krbtgt/platalyticsrealm@platalyticsrealm expiring on Wed Aug 03 06:23:59
PKT 2016
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for nn/hadoop-master@platalyticsrealm to go to
krbtgt/platalyticsrealm@platalyticsrealm expiring on Wed Aug 03 06:23:59
PKT 2016
Service ticket not found in the subject
>>> Credentials acquireServiceCreds: same realm
Using builtin default etypes for default_tgs_enctypes
default etypes for default_tgs_enctypes: 18 17 16 23 1 3.
>>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>> KdcAccessibility: reset
>>> KrbKdcReq send: kdc=platalytics.com UDP:88, timeout=30000, number of
retries =3, #bytes=727
>>> KDCCommunication: kdc=platalytics.com UDP:88, timeout=30000,Attempt =1,
#bytes=727
>>> KrbKdcReq send: #bytes read=686
>>> KdcAccessibility: remove platalytics.com
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
>>> KrbApReq: APOptions are 00100000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
Krb5Context setting mySeqNumber to: 822249937
Created InitSecContextToken:
0000: 01 00 6E 82 02 67 30 82   02 63 A0 03 02 01 05 A1  ..n..g0..c......
0010: 03 02 01 0E A2 07 03 05   00 20 00 00 00 A3 82 01  ......... ......
0020: 6F 61 82 01 6B 30 82 01   67 A0 03 02 01 05 A1 12  oa..k0..g.......
0030: 1B 10 70 6C 61 74 61 6C   79 74 69 63 73 72 65 61  ..platalyticsrea
0040: 6C 6D A2 1E 30 1C A0 03   02 01 00 A1 15 30 13 1B  lm..0........0..
0050: 02 6E 6E 1B 0D 68 61 64   6F 6F 70 2D 6D 61 73 74  .nn..hadoop-mast
0060: 65 72 A3 82 01 2A 30 82   01 26 A0 03 02 01 10 A1  er...*0..&......
0070: 03 02 01 04 A2 82 01 18   04 82 01 14 25 56 29 BE  ............%V).
0080: 2E AA 50 55 7B 2C 5C AC   BA 64 2D 4D 8D 9C 71 B1  ..PU.,\..d-M..q.
0090: 1A 99 14 81 4C 98 80 B2   65 86 6C 37 61 67 31 D1  ....L...e.l7ag1.
00A0: 6F F6 E7 7A F3 92 A5 9A   F0 BA A5 BE 1C 15 7F 14  o..z............
00B0: 85 7E B0 7A 81 3D 9C B6   00 80 43 00 2A 0C 89 6A  ...z.=....C.*..j
00C0: B1 49 EF 27 F9 97 A1 3E   5C 80 B7 0D 49 6C E0 A3  .I.'...>\...Il..
00D0: 73 BC C2 69 AE 92 88 26   C5 DA FD 6E AB 55 F7 60  s..i...&...n.U.`
00E0: D0 7E 3A A5 5D 78 4E 3F   3D 96 44 6B B9 8F EA D8  ..:.]xN?=.Dk....
00F0: 4E BA 70 F3 5C 25 4E ED   AD E2 76 09 FF 36 D8 6D  N.p.\%N...v..6.m
0100: A4 22 C3 93 10 04 04 F2   6C D4 04 C9 A9 14 95 47  ."......l......G
0110: 16 BA 62 6F 58 5F 4F 8E   38 23 A5 5C 1D 58 F8 D5  ..boX_O.8#.\.X..
0120: 87 23 3D 7F 0B A7 BE 18   25 1F F1 7B 4C 54 EC BD  .#=.....%...LT..
0130: A6 D4 05 4C 82 03 64 FD   5A 4E 24 D8 71 D5 5A 15  ...L..d.ZN$.q.Z.
0140: 4C 2E E3 12 88 19 19 09   C1 F9 31 9D 6E CE D4 6F  L.........1.n..o
0150: 7A 20 F6 82 BB F6 28 D1   ED A3 54 69 01 9E A4 4C  z ....(...Ti...L
0160: 40 E2 E0 FC F5 35 44 C1   25 8C 50 1F C0 01 1D C0  @....5D.%.P.....
0170: 63 A5 45 B8 56 DF F7 F8   CA 86 8B 96 0C 5C 49 EA  c.E.V........\I.
0180: F0 A9 70 9C 2E 0E 36 57   65 47 97 09 8C 24 F1 00  ..p...6WeG...$..
0190: A4 81 DA 30 81 D7 A0 03   02 01 10 A2 81 CF 04 81  ...0............
01A0: CC F1 F6 BE 3A A7 C0 1A   04 D0 72 DE 57 94 D1 FE  ....:.....r.W...
01B0: 16 7E E8 09 72 D7 83 54   B3 1C 98 59 36 86 78 12  ....r..T...Y6.x.
01C0: A5 02 E3 B6 8C C6 83 B5   C9 7C 53 A3 C9 79 AF C8  ..........S..y..
01D0: B8 1A B3 B2 A6 7E 02 1A   A5 9C 41 EA 08 87 A8 E5  ..........A.....
01E0: D1 0E ED 69 5C CA 33 63   24 C8 4B E1 57 D5 C3 AF  ...i\.3c$.K.W...
01F0: 39 0A DE F6 9F 63 3B 44   79 5B 29 F7 9A B0 2E 8B  9....c;Dy[).....
0200: 1C EF 4A 0B D9 3A 55 75   C5 38 B7 5C 50 11 0E 74  ..J..:Uu.8.\P..t
0210: BE 57 DC 70 30 DD AF 14   35 97 1C 14 11 70 46 FD  .W.p0...5....pF.
0220: F9 8C 14 60 DE 35 D8 DC   81 86 C7 31 1F F8 6A 65  ...`.5.....1..je
0230: 2D B7 8A EF F2 61 21 00   2C 8D 4F 3A 49 1E 24 80  -....a!.,.O:I.$.
0240: FA 56 D0 2D 0E 52 AE 29   2B 6A 4A C7 16 8F B5 D8  .V.-.R.)+jJ.....
0250: EC 41 18 03 34 F2 D8 94   79 82 C8 0D E2 10 72 39  .A..4...y.....r9
0260: 85 B9 F7 BB 54 5C 71 21   49 23 A5 4A D0           ....T\q!I#.J.

16/08/02 18:34:13 DEBUG security.SaslRpcClient: Sending sasl message state:
INITIATE
token:
"`\202\002x\006\t*\206H\206\367\022\001\002\002\001\000n\202\002g0\202\002c\240\003\002\001\005\241\003\002\001\016\242\a\003\005\000
\000\000\000\243\202\001oa\202\001k0\202\001g\240\003\002\001\005\241\022\033\020platalyticsrealm\242\0360\034\240\003\002\001\000\241\0250\023\033\002nn\033\rhadoop-master\243\202\001*0\202\001&\240\003\002\001\020\241\003\002\001\004\242\202\001\030\004\202\001\024%V)\276.\252PU{,\\\254\272d-M\215\234q\261\032\231\024\201L\230\200\262e\206l7ag1\321o\366\347z\363\222\245\232\360\272\245\276\034\025
\024\205~\260z\201=\234\266\000\200C\000*\f\211j\261I\357\'\371\227\241>\\\200\267\rIl\340\243s\274\302i\256\222\210&\305\332\375n\253U\367`\320~:\245]xN?=\226Dk\271\217\352\330N\272p\363\\%N\355\255\342v\t\3776\330m\244\"\303\223\020\004\004\362l\324\004\311\251\024\225G\026\272boX_O\2168#\245\\\035X\370\325\207#=
\v\247\276\030%\037\361{LT\354\275\246\324\005L\202\003d\375ZN$\330q\325Z\025L.\343\022\210\031\031\t\301\3711\235n\316\324oz
\366\202\273\366(\321\355\243Ti\001\236\244L@
\342\340\374\3655D\301%\214P\037\300\001\035\300c\245E\270V\337\367\370\312\206\213\226\f\\I\352\360\251p\234.\0166WeG\227\t\214$\361\000\244\201\3320\201\327\240\003\002\001\020\242\201\317\004\201\314\361\366\276:\247\300\032\004\320r\336W\224\321\376\026~\350\tr\327\203T\263\034\230Y6\206x\022\245\002\343\266\214\306\203\265\311|S\243\311y\257\310\270\032\263\262\246~\002\032\245\234A\352\b\207\250\345\321\016\355i\\\3123c$\310K\341W\325\303\2579\n\336\366\237c;Dy[)\367\232\260.\213\034\357J\v\331:Uu\3058\267\\P\021\016t\276W\334p0\335\257\0245\227\034\024\021pF\375\371\214\024`\3365\330\334\201\206\3071\037\370je-\267\212\357\362a!\000,\215O:I\036$\200\372V\320-\016R\256)+jJ\307\026\217\265\330\354A\030\0034\362\330\224y\202\310\r\342\020r9\205\271\367\273T\\q!I#\245J\320"
auths {
  method: "KERBEROS"
  mechanism: "GSSAPI"
  protocol: "nn"
  serverId: "hadoop-master"
}

16/08/02 18:34:13 DEBUG security.SaslRpcClient: Received SASL message
state: CHALLENGE
token:
"`l\006\t*\206H\206\367\022\001\002\002\002\000o]0[\240\003\002\001\005\241\003\002\001\017\242O0M\240\003\002\001\020\242F\004D\337\316\251\336\365\261O@\377
\"\035\203\002\357Z\231e\332\357\364\204>d\325\"\340\263\2302\031\277\023G\342=\355\334)\303\271\t\376\252\225\207\033\000\243\332\252\335{\"\033\025
\fW\225\300\375\272\201\367\216\371\273"

Entered Krb5Context.initSecContext with state=STATE_IN_PROCESS
>>> EType: sun.security.krb5.internal.crypto.Des3CbcHmacSha1KdEType
Krb5Context setting peerSeqNumber to: 766454664
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Sending sasl message state:
RESPONSE
token: ""

16/08/02 18:34:13 DEBUG security.SaslRpcClient: Received SASL message
state: CHALLENGE
token:
"`?\006\t*\206H\206\367\022\001\002\002\002\001\004\000\377\377\377\377\272
\237\354\300\003\367{\207A\267\371\245\327\374\333\021\026\375}\353\035\254\327\305\272\373\305\365L\022\374.A\203\002\001\001\000\000\004\004\004\004"

Krb5Context.unwrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02 01 04
00 ff ff ff ff ba 20 9f ec c0 03 f7 7b 87 41 b7 f9 a5 d7 fc db 11 16 fd 7d
eb 1d ac d7 c5 ba fb c5 f5 4c 12 fc 2e 41 83 02 01 01 00 00 04 04 04 04 ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[60 3f 06 09 2a 86 48 86 f7 12 01 02 02 02 01 04 00
ff ff ff ff 33 b9 e5 96 b6 c8 d3 80 4f 8a a1 5b 44 c9 b6 76 ea fe ec 80 be
37 12 e1 04 cc e5 0f 2a f8 16 1b 9e 72 17 dc 01 01 00 00 04 04 04 04 ]
16/08/02 18:34:13 DEBUG security.SaslRpcClient: Sending sasl message state:
RESPONSE
token:
"`?\006\t*\206H\206\367\022\001\002\002\002\001\004\000\377\377\377\3773\271\345\226\266\310\323\200O\212\241[D\311\266v\352\376\354\200\2767\022\341\004\314\345\017*\370\026\033\236r\027\334\001\001\000\000\004\004\004\004"

16/08/02 18:34:13 DEBUG security.SaslRpcClient: Received SASL message
state: SUCCESS

16/08/02 18:34:13 DEBUG ipc.Client: Negotiated QOP is :auth
16/08/02 18:34:13 DEBUG ipc.Client: IPC Client (1594470328) connection to /
192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm: starting,
having connections 1
16/08/02 18:34:13 DEBUG ipc.Client: IPC Client (1594470328) connection to /
192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm sending #0
16/08/02 18:34:13 DEBUG ipc.Client: IPC Client (1594470328) connection to /
192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm got value #0
16/08/02 18:34:13 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 594ms
16/08/02 18:34:14 TRACE ipc.ProtobufRpcEngine: 1: Response <- /
192.168.23.206:8020: getFileInfo {fs { fileType: IS_DIR path: "" length: 0
permission { perm: 493 } owner: "hdfs" group: "supergroup"
modification_time: 1470131070337 access_time: 0 block_replication: 0
blocksize: 0 fileId: 16385 childrenNum: 1 storagePolicy: 0 }}
16/08/02 18:34:14 TRACE ipc.ProtobufRpcEngine: 1: Call -> /
192.168.23.206:8020: getListing {src: "/" startAfter: "" needLocation:
false}
16/08/02 18:34:14 DEBUG ipc.Client: IPC Client (1594470328) connection to /
192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm sending #1
16/08/02 18:34:14 DEBUG ipc.Client: IPC Client (1594470328) connection to /
192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm got value #1
16/08/02 18:34:14 DEBUG ipc.ProtobufRpcEngine: Call: getListing took 7ms
16/08/02 18:34:14 TRACE ipc.ProtobufRpcEngine: 1: Response <- /
192.168.23.206:8020: getListing {dirList { partialListing { fileType:
IS_DIR path: "ranger" length: 0 permission { perm: 493 } owner: "hdfs"
group: "supergroup" modification_time: 1470131070364 access_time: 0
block_replication: 0 blocksize: 0 fileId: 16386 childrenNum: 1
storagePolicy: 0 } remainingEntries: 0 }}
*Found 1 items*
*drwxr-xr-x   - hdfs supergroup          0 2016-08-02 14:44 /ranger*
16/08/02 18:34:14 DEBUG ipc.Client: stopping client from cache:
org.apache.hadoop.ipc.Client@5e0df7af
16/08/02 18:34:14 DEBUG ipc.Client: removing client from cache:
org.apache.hadoop.ipc.Client@5e0df7af
16/08/02 18:34:14 DEBUG ipc.Client: stopping actual client because no more
references remain: org.apache.hadoop.ipc.Client@5e0df7af
16/08/02 18:34:14 DEBUG ipc.Client: Stopping client
16/08/02 18:34:14 DEBUG ipc.Client: IPC Client (1594470328) connection to /
192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm: closed
16/08/02 18:34:14 DEBUG ipc.Client: IPC Client (1594470328) connection to /
192.168.23.206:8020 from nn/hadoop-master@platalyticsrealm: stopped,
remaining connections 0
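
So the ticket cache itself looks fine from the client side. For the
daemons, my understanding is that the master and region server do not
read this cache at all: they log in from the keytab and principal
configured in hbase-site.xml, and the embedded ZooKeeper client logs in
from the JAAS file passed via HBASE_OPTS. For reference, a minimal
zk-jaas.conf for the daemon side has the shape below; the keytab path
and principal here are placeholders, not the exact values from the
attached file:

Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  useTicketCache=false
  keyTab="/etc/hbase/conf/hbase.keytab"
  principal="hbase/hadoop-master@platalyticsrealm";
};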



Re: issue starting regionserver with SASL authentication failed

Posted by Dima Spivak <ds...@cloudera.com>.
Hm, not sure what to say. The error seems to be pointing at not having a
TGT...
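
One quick sanity check, assuming the usual keytab layout (the keytab
path and principal below are placeholders, adapt them to your setup):

  # what the current ticket cache holds, if anything
  klist
  # which principals the keytab actually contains
  klist -kt /etc/security/keytabs/hbase.service.keytab
  # get a fresh TGT from the keytab, then re-check the cache
  kinit -kt /etc/security/keytabs/hbase.service.keytab \
      hbase/hadoop-master@platalyticsrealm
  klist

A mismatch between what the keytab contains and the principal the
daemon is configured to use is one way to end up with "Failed to find
any Kerberos tgt".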

-Dima


Re: issue starting regionserver with SASL authentication failed

Posted by Aneela Saleem <an...@platalytics.com>.
Yes, I have kinit'd as the service user, but I'm still getting the error.


Re: issue starting regionserver with SASL authentication failed

Posted by Dima Spivak <ds...@cloudera.com>.
The stacktrace suggests you don't have a ticket-granting ticket. Have you
kinit'd as the service user?

-Dima


Re: issue starting regionserver with SASL authentication failed

Posted by Aneela Saleem <an...@platalytics.com>.
Hi Dima,

I have now followed the official reference guide, but I still get the
same error. Attached is the hbase-site.xml file; please have a look.
What's wrong there?


Re: issue starting regionserver with SASL authentication failed

Posted by Dima Spivak <ds...@cloudera.com>.
I haven't looked in detail at your hbase-site.xml, but if you're running
Apache HBase (and not a CDH release), I'd recommend using the official
reference guide [1] to configure your cluster instead of the CDH 4.2.0
docs, since those correspond to HBase 0.94 and may well require
different steps to set up security. If you are trying out CDH HBase, be
sure to use up-to-date documentation for your release.

Let us know how it goes.

[1] https://hbase.apache.org/book.html#hbase.secure.configuration
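
At a high level, the server-side part of that section comes down to
properties like the ones below. Treat this as a sketch: the principal
format and keytab path are examples to adapt, not values to copy.

<property>
  <name>hbase.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hbase.security.authorization</name>
  <value>true</value>
</property>
<property>
  <!-- example principal; _HOST expands to the local hostname -->
  <name>hbase.master.kerberos.principal</name>
  <value>hbase/_HOST@YOUR-REALM.COM</value>
</property>
<property>
  <!-- example path; point at your actual keytab -->
  <name>hbase.master.keytab.file</name>
  <value>/etc/hbase/conf/hbase.keytab</value>
</property>
<property>
  <name>hbase.regionserver.kerberos.principal</name>
  <value>hbase/_HOST@YOUR-REALM.COM</value>
</property>
<property>
  <name>hbase.regionserver.keytab.file</name>
  <value>/etc/hbase/conf/hbase.keytab</value>
</property>

The daemons read these at startup, so they need to line up with what is
actually in the keytab on each host.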

-Dima


Re: issue starting regionserver with SASL authentication failed

Posted by Aneela Saleem <an...@platalytics.com>.
Hi Dima,

I'm running HBase version 1.2.2.

On Thu, Jul 28, 2016 at 8:35 PM, Dima Spivak <ds...@cloudera.com> wrote:

> Hi Aneela,
>
> What version of HBase are you running?
>
> -Dima
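The trace above bottoms out in "GSSException: No valid credentials provided
(Mechanism level: Failed to find any Kerberos tgt)", so a quick sanity check
is whether the regionserver's keytab can obtain a ticket at all. A minimal
sketch, assuming the keytab lives at /etc/hbase/conf/hbase.keytab and the
principal is hbase/hadoop-master@EXAMPLE.COM -- both are illustrative values,
substitute your own realm and paths:

  # Try to get a TGT directly from the keytab the regionserver is configured with
  kinit -kt /etc/hbase/conf/hbase.keytab hbase/hadoop-master@EXAMPLE.COM
  # List the resulting ticket cache; a valid krbtgt entry should appear
  klist

If kinit itself fails, the keytab or principal is wrong (or the KDC is
unreachable); if it succeeds, the more likely culprit is the principal/keytab
pair named in hbase-site.xml or the JAAS file the daemon actually loads.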

Re: issue starting regionserver with SASL authentication failed

Posted by Dima Spivak <ds...@cloudera.com>.
Hi Aneela,

What version of HBase are you running?

-Dima

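For context, the configuration this thread revolves around typically boils
down to the sketch below. This is a hedged illustration: the realm
EXAMPLE.COM, the hostname, and the keytab path are assumptions, not the
poster's attached values.

hbase-site.xml (on the master and each regionserver):

  <property>
    <name>hbase.security.authentication</name>
    <value>kerberos</value>
  </property>
  <property>
    <name>hbase.regionserver.kerberos.principal</name>
    <value>hbase/_HOST@EXAMPLE.COM</value>
  </property>
  <property>
    <name>hbase.regionserver.keytab.file</name>
    <value>/etc/hbase/conf/hbase.keytab</value>
  </property>
  <property>
    <name>hbase.master.kerberos.principal</name>
    <value>hbase/_HOST@EXAMPLE.COM</value>
  </property>
  <property>
    <name>hbase.master.keytab.file</name>
    <value>/etc/hbase/conf/hbase.keytab</value>
  </property>

zk-jaas.conf (the Client section is what HBase's ZooKeeper client reads):

  Client {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    keyTab="/etc/hbase/conf/hbase.keytab"
    storeKey=true
    useTicketCache=false
    principal="hbase/hadoop-master@EXAMPLE.COM";
  };

with hbase-env.sh pointing the JVM at the JAAS file, e.g.:

  export HBASE_OPTS="$HBASE_OPTS -Djava.security.auth.login.config=/etc/hbase/conf/zk-jaas.conf"

Note that _HOST expands to the local host's name as Hadoop resolves it, so a
matching entry for each host's principal must exist in the keytab; a "Failed
to find any Kerberos tgt" at startup usually means that keytab login never
happened or used the wrong entry.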