Posted to user@hadoop.apache.org by Benjamin Ross <br...@Lattice-Engines.com> on 2016/10/11 15:38:52 UTC

Authentication Failure talking to Ranger KMS

All,
I'm trying to use httpfs to write to an encryption zone with security off.  I can read from an encryption zone, but I can't write to one.
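For reference, the failing write is issued through the HttpFS endpoint (which speaks the webhdfs REST protocol), roughly like this minimal sketch; the host name and file path below are placeholders, not my actual setup:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HttpfsEzWriteTest {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint: HttpFS listens on port 14000 by default and speaks
        // the same webhdfs REST protocol as the NameNode's WebHDFS.
        URI httpfs = URI.create("webhdfs://httpfs-host.example.com:14000");
        Configuration conf = new Configuration();
        // Run the request as root; HttpFS itself authenticates as the httpfs user
        // and proxies on behalf of root (matching effectiveUser/realUser in the log).
        FileSystem fs = FileSystem.get(httpfs, conf, "root");
        // Creating a file under an encryption zone is what makes the NameNode call
        // the KMS for an encrypted data encryption key (generateEncryptedKey below).
        try (FSDataOutputStream out = fs.create(new Path("/tmp/cryptotest/testfile"))) {
            out.writeBytes("hello\n");
        }
        fs.close();
    }
}

An equivalent curl PUT against the HttpFS /webhdfs/v1 endpoint exercises the same path.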

Here's the applicable namenode logs.  httpfs and root both have all possible privileges in the KMS.  What am I missing?


2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:authorizeConnection(2095)) - Successfully authorized userInfo {
  effectiveUser: "root"
  realUser: "httpfs"
}
protocol: "org.apache.hadoop.hdfs.protocol.ClientProtocol"

2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:processOneRpc(1902)) -  got #2
2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:run(2179)) - IPC Server handler 9 on 8020: org.apache.hadoop.hdfs.protocol.ClientProtocol.create from 10.41.1.64:47622 Call#2 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER
2016-10-07 15:48:16,165 DEBUG security.UserGroupInformation (UserGroupInformation.java:logPrivilegedAction(1751)) - PrivilegedAction as:root (auth:PROXY) via httpfs (auth:SIMPLE) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2205)
2016-10-07 15:48:16,166 DEBUG hdfs.StateChange (NameNodeRpcServer.java:create(699)) - *DIR* NameNode.create: file /tmp/cryptotest/hairyballs for DFSClient_NONMAPREDUCE_-1005188439_28 at 10.41.1.64
2016-10-07 15:48:16,166 DEBUG hdfs.StateChange (FSNamesystem.java:startFileInt(2411)) - DIR* NameSystem.startFile: src=/tmp/cryptotest/hairyballs, holder=DFSClient_NONMAPREDUCE_-1005188439_28, clientMachine=10.41.1.64, createParent=true, replication=3, createFlag=[CREATE, OVERWRITE], blockSize=134217728, supportedVersions=[CryptoProtocolVersion{description='Encryption zones', version=2, unknownValue=null}]
2016-10-07 15:48:16,167 DEBUG security.UserGroupInformation (UserGroupInformation.java:logPrivilegedAction(1751)) - PrivilegedAction as:hdfs (auth:SIMPLE) from:org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:484)
2016-10-07 15:48:16,171 DEBUG client.KerberosAuthenticator (KerberosAuthenticator.java:authenticate(205)) - Using fallback authenticator sequence.
2016-10-07 15:48:16,176 DEBUG security.UserGroupInformation (UserGroupInformation.java:doAs(1728)) - PrivilegedActionException as:hdfs (auth:SIMPLE) cause:org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
2016-10-07 15:48:16,176 DEBUG ipc.Server (ProtobufRpcEngine.java:call(631)) - Served: create queueTime= 2 procesingTime= 10 exception= IOException
2016-10-07 15:48:16,177 DEBUG security.UserGroupInformation (UserGroupInformation.java:doAs(1728)) - PrivilegedActionException as:root (auth:PROXY) via httpfs (auth:SIMPLE) cause:java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
2016-10-07 15:48:16,177 INFO  ipc.Server (Server.java:logException(2299)) - IPC Server handler 9 on 8020, call org.apache.hadoop.hdfs.protocol.ClientProtocol.create from 10.41.1.64:47622 Call#2 Retry#0
java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider.generateEncryptedKey(KMSClientProvider.java:750)
        at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:371)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.generateEncryptedDataEncryptionKey(FSNamesystem.java:2352)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2478)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2377)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:716)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:405)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2211)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2207)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2205)
Caused by: java.util.concurrent.ExecutionException: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
        at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:289)
        at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:276)
        at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:111)
        at com.google.common.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:132)
        at com.google.common.cache.LocalCache$Segment.getAndRecordStats(LocalCache.java:2381)
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2351)
        at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2313)
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2228)
        at com.google.common.cache.LocalCache.get(LocalCache.java:3965)
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3969)
        at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4829)
        at org.apache.hadoop.crypto.key.kms.ValueQueue.getAtMost(ValueQueue.java:266)
        at org.apache.hadoop.crypto.key.kms.ValueQueue.getNext(ValueQueue.java:226)
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider.generateEncryptedKey(KMSClientProvider.java:745)
        ... 15 more
Caused by: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:495)
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider.access$100(KMSClientProvider.java:84)
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider$EncryptedQueueRefiller.fillQueueForKey(KMSClientProvider.java:133)
        at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:181)
        at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:175)
        at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3568)
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2350)
        ... 23 more
Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
        at org.apache.hadoop.security.authentication.client.AuthenticatedURL.extractToken(AuthenticatedURL.java:274)




RE: Authentication Failure talking to Ranger KMS

Posted by Benjamin Ross <br...@Lattice-Engines.com>.
Just for kicks I tried applying the patch in that ticket and it didn't have any effect.  It makes sense because my issue is on CREATE, and the ticket only has to do with OPEN.

Note that I don't have these issues using WebHDFS, only using httpfs, so it definitely seems like we're on the right track...

Thanks in advance,
Ben



________________________________
From: Benjamin Ross
Sent: Tuesday, October 11, 2016 12:02 PM
To: Wei-Chiu Chuang
Cc: user@hadoop.apache.org; user@ranger.incubator.apache.org
Subject: RE: Authentication Failure talking to Ranger KMS

That seems promising.  But shouldn't I be able to work around it by just ensuring that httpfs has all necessary privileges in the KMS service under Ranger?

Thanks,
Ben


________________________________
From: Wei-Chiu Chuang [weichiu@cloudera.com]
Sent: Tuesday, October 11, 2016 11:57 AM
To: Benjamin Ross
Cc: user@hadoop.apache.org; user@ranger.incubator.apache.org
Subject: Re: Authentication Failure talking to Ranger KMS

Seems to me you encountered this bug: HDFS-10481<https://issues.apache.org/jira/browse/HDFS-10481>
If you’re using CDH, this is fixed in CDH5.5.5, CDH5.7.2 and CDH5.8.2

Wei-Chiu Chuang
A very happy Clouderan

On Oct 11, 2016, at 8:38 AM, Benjamin Ross <br...@Lattice-Engines.com>> wrote:

All,
I'm trying to use httpfs to write to an encryption zone with security off.  I can read from an encryption zone, but I can't write to one.

Here's the applicable namenode logs.  httpfs and root both have all possible privileges in the KMS.  What am I missing?


2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:authorizeConnection(2095)) - Successfully authorized userInfo {
  effectiveUser: "root"
  realUser: "httpfs"
}
protocol: "org.apache.hadoop.hdfs.protocol.ClientProtocol"

2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:processOneRpc(1902)) -  got #2
2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:run(2179)) - IPC Server handler 9 on 8020: org.apache.hadoop.hdfs.protocol.ClientProtocol.create from 10.41.1.64:47622 Call#2 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER
2016-10-07 15:48:16,165 DEBUG security.UserGroupInformation (UserGroupInformation.java:logPrivilegedAction(1751)) - PrivilegedAction as:root (auth:PROXY) via httpfs (auth:SIMPLE) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2205)
2016-10-07 15:48:16,166 DEBUG hdfs.StateChange (NameNodeRpcServer.java:create(699)) - *DIR* NameNode.create: file /tmp/cryptotest/hairyballs for DFSClient_NONMAPREDUCE_-1005188439_28 at 10.41.1.64
2016-10-07 15:48:16,166 DEBUG hdfs.StateChange (FSNamesystem.java:startFileInt(2411)) - DIR* NameSystem.startFile: src=/tmp/cryptotest/hairyballs, holder=DFSClient_NONMAPREDUCE_-1005188439_28, clientMachine=10.41.1.64, createParent=true, replication=3, createFlag=[CREATE
, OVERWRITE], blockSize=134217728, supportedVersions=[CryptoProtocolVersion{description='Encryption zones', version=2, unknownValue=null}]
2016-10-07 15:48:16,167 DEBUG security.UserGroupInformation (UserGroupInformation.java:logPrivilegedAction(1751)) - PrivilegedAction as:hdfs (auth:SIMPLE) from:org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:484)
2016-10-07 15:48:16,171 DEBUG client.KerberosAuthenticator (KerberosAuthenticator.java:authenticate(205)) - Using fallback authenticator sequence.
2016-10-07 15:48:16,176 DEBUG security.UserGroupInformation (UserGroupInformation.java:doAs(1728)) - PrivilegedActionException as:hdfs (auth:SIMPLE) cause:org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, messag
e: Forbidden
2016-10-07 15:48:16,176 DEBUG ipc.Server (ProtobufRpcEngine.java:call(631)) - Served: create queueTime= 2 procesingTime= 10 exception= IOException
2016-10-07 15:48:16,177 DEBUG security.UserGroupInformation (UserGroupInformation.java:doAs(1728)) - PrivilegedActionException as:root (auth:PROXY) via httpfs (auth:SIMPLE) cause:java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: org.apach
e.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
2016-10-07 15:48:16,177 INFO  ipc.Server (Server.java:logException(2299)) - IPC Server handler 9 on 8020, call org.apache.hadoop.hdfs.protocol.ClientProtocol.create from 10.41.1.64:47622 Call#2 Retry#0
java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider.generateEncryptedKey(KMSClientProvider.java:750)
        at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:371)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.generateEncryptedDataEncryptionKey(FSNamesystem.java:2352)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2478)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2377)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:716)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:405)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2211)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2207)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2205)
Caused by: java.util.concurrent.ExecutionException: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
        at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:289)
        at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:276)
        at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:111)
        at com.google.common.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:132)
        at com.google.common.cache.LocalCache$Segment.getAndRecordStats(LocalCache.java:2381)
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2351)
        at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2313)
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2228)
        at com.google.common.cache.LocalCache.get(LocalCache.java:3965)
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3969)
        at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4829)
        at org.apache.hadoop.crypto.key.kms.ValueQueue.getAtMost(ValueQueue.java:266)
        at org.apache.hadoop.crypto.key.kms.ValueQueue.getNext(ValueQueue.java:226)
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider.generateEncryptedKey(KMSClientProvider.java:745)
        ... 15 more
Caused by: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:495)
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider.access$100(KMSClientProvider.java:84)
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider$EncryptedQueueRefiller.fillQueueForKey(KMSClientProvider.java:133)
        at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:181)
        at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:175)
        at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3568)
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2350)
        ... 23 more
Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
        at org.apache.hadoop.security.authentication.client.AuthenticatedURL.extractToken(AuthenticatedURL.java:274)






Re: Authentication Failure talking to Ranger KMS

Posted by Velmurugan Periasamy <vp...@hortonworks.com>.
Is the httpfs user configured to proxy as other users?

You can see if there are any clues in the KMS log or audit log.
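For reference, a minimal sketch of what that proxy-user configuration looks like (property names are the standard Hadoop KMS/HttpFS ones; the host value is a placeholder). On the KMS side, in kms-site.xml:

<!-- kms-site.xml: allow the httpfs service user to impersonate end users at the KMS -->
<property>
  <name>hadoop.kms.proxyuser.httpfs.users</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.kms.proxyuser.httpfs.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.kms.proxyuser.httpfs.hosts</name>
  <value>httpfs-host.example.com</value>
</property>

On the cluster side, core-site.xml needs the usual hadoop.proxyuser.httpfs.hosts / hadoop.proxyuser.httpfs.groups entries so HttpFS can impersonate end users at the NameNode. Whether that is what's being rejected here should show up in the KMS audit log.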

From: Benjamin Ross <br...@Lattice-Engines.com>>
Reply-To: "user@ranger.incubator.apache.org<ma...@ranger.incubator.apache.org>" <us...@ranger.incubator.apache.org>>
Date: Tuesday, October 11, 2016 at 9:02 AM
To: Wei-Chiu Chuang <we...@cloudera.com>>
Cc: "user@hadoop.apache.org<ma...@hadoop.apache.org>" <us...@hadoop.apache.org>>, "user@ranger.incubator.apache.org<ma...@ranger.incubator.apache.org>" <us...@ranger.incubator.apache.org>>
Subject: RE: Authentication Failure talking to Ranger KMS

That seems promising.  But shouldn't I be able to work around it by just ensuring that httpfs has all necessary privileges in the KMS service under Ranger?

Thanks,
Ben


________________________________
From: Wei-Chiu Chuang [weichiu@cloudera.com<ma...@cloudera.com>]
Sent: Tuesday, October 11, 2016 11:57 AM
To: Benjamin Ross
Cc: user@hadoop.apache.org<ma...@hadoop.apache.org>; user@ranger.incubator.apache.org<ma...@ranger.incubator.apache.org>
Subject: Re: Authentication Failure talking to Ranger KMS

Seems to me you encountered this bug: HDFS-10481<https://issues.apache.org/jira/browse/HDFS-10481>
If you’re using CDH, this is fixed in CDH5.5.5, CDH5.7.2 and CDH5.8.2

Wei-Chiu Chuang
A very happy Clouderan

On Oct 11, 2016, at 8:38 AM, Benjamin Ross <br...@Lattice-Engines.com>> wrote:

All,
I'm trying to use httpfs to write to an encryption zone with security off.  I can read from an encryption zone, but I can't write to one.

Here's the applicable namenode logs.  httpfs and root both have all possible privileges in the KMS.  What am I missing?


2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:authorizeConnection(2095)) - Successfully authorized userInfo {
  effectiveUser: "root"
  realUser: "httpfs"
}
protocol: "org.apache.hadoop.hdfs.protocol.ClientProtocol"

2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:processOneRpc(1902)) -  got #2
2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:run(2179)) - IPC Server handler 9 on 8020: org.apache.hadoop.hdfs.protocol.ClientProtocol.create from 10.41.1.64:47622 Call#2 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER
2016-10-07 15:48:16,165 DEBUG security.UserGroupInformation (UserGroupInformation.java:logPrivilegedAction(1751)) - PrivilegedAction as:root (auth:PROXY) via httpfs (auth:SIMPLE) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2205)
2016-10-07 15:48:16,166 DEBUG hdfs.StateChange (NameNodeRpcServer.java:create(699)) - *DIR* NameNode.create: file /tmp/cryptotest/hairyballs for DFSClient_NONMAPREDUCE_-1005188439_28 at 10.41.1.64
2016-10-07 15:48:16,166 DEBUG hdfs.StateChange (FSNamesystem.java:startFileInt(2411)) - DIR* NameSystem.startFile: src=/tmp/cryptotest/hairyballs, holder=DFSClient_NONMAPREDUCE_-1005188439_28, clientMachine=10.41.1.64, createParent=true, replication=3, createFlag=[CREATE
, OVERWRITE], blockSize=134217728, supportedVersions=[CryptoProtocolVersion{description='Encryption zones', version=2, unknownValue=null}]
2016-10-07 15:48:16,167 DEBUG security.UserGroupInformation (UserGroupInformation.java:logPrivilegedAction(1751)) - PrivilegedAction as:hdfs (auth:SIMPLE) from:org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:484)
2016-10-07 15:48:16,171 DEBUG client.KerberosAuthenticator (KerberosAuthenticator.java:authenticate(205)) - Using fallback authenticator sequence.
2016-10-07 15:48:16,176 DEBUG security.UserGroupInformation (UserGroupInformation.java:doAs(1728)) - PrivilegedActionException as:hdfs (auth:SIMPLE) cause:org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, messag
e: Forbidden
2016-10-07 15:48:16,176 DEBUG ipc.Server (ProtobufRpcEngine.java:call(631)) - Served: create queueTime= 2 procesingTime= 10 exception= IOException
2016-10-07 15:48:16,177 DEBUG security.UserGroupInformation (UserGroupInformation.java:doAs(1728)) - PrivilegedActionException as:root (auth:PROXY) via httpfs (auth:SIMPLE) cause:java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: org.apach
e.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
2016-10-07 15:48:16,177 INFO  ipc.Server (Server.java:logException(2299)) - IPC Server handler 9 on 8020, call org.apache.hadoop.hdfs.protocol.ClientProtocol.create from 10.41.1.64:47622 Call#2 Retry#0
java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider.generateEncryptedKey(KMSClientProvider.java:750)
        at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:371)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.generateEncryptedDataEncryptionKey(FSNamesystem.java:2352)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2478)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2377)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:716)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:405)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2211)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2207)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2205)
Caused by: java.util.concurrent.ExecutionException: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
        at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:289)
        at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:276)
        at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:111)
        at com.google.common.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:132)
        at com.google.common.cache.LocalCache$Segment.getAndRecordStats(LocalCache.java:2381)
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2351)
        at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2313)
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2228)
        at com.google.common.cache.LocalCache.get(LocalCache.java:3965)
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3969)
        at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4829)
        at org.apache.hadoop.crypto.key.kms.ValueQueue.getAtMost(ValueQueue.java:266)
        at org.apache.hadoop.crypto.key.kms.ValueQueue.getNext(ValueQueue.java:226)
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider.generateEncryptedKey(KMSClientProvider.java:745)
        ... 15 more
Caused by: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:495)
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider.access$100(KMSClientProvider.java:84)
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider$EncryptedQueueRefiller.fillQueueForKey(KMSClientProvider.java:133)
        at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:181)
        at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:175)
        at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3568)
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2350)
        ... 23 more
Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
        at org.apache.hadoop.security.authentication.client.AuthenticatedURL.extractToken(AuthenticatedURL.java:274)





RE: Authentication Failure talking to Ranger KMS

Posted by Benjamin Ross <br...@Lattice-Engines.com>.
That seems promising.  But shouldn't I be able to work around it by just ensuring that httpfs has all necessary privileges in the KMS service under Ranger?

Thanks,
Ben


________________________________
From: Wei-Chiu Chuang [weichiu@cloudera.com]
Sent: Tuesday, October 11, 2016 11:57 AM
To: Benjamin Ross
Cc: user@hadoop.apache.org; user@ranger.incubator.apache.org
Subject: Re: Authentication Failure talking to Ranger KMS

Seems to me you encountered this bug: HDFS-10481<https://issues.apache.org/jira/browse/HDFS-10481>
If you’re using CDH, this is fixed in CDH5.5.5, CDH5.7.2 and CDH5.8.2

Wei-Chiu Chuang
A very happy Clouderan

On Oct 11, 2016, at 8:38 AM, Benjamin Ross <br...@Lattice-Engines.com>> wrote:

All,
I'm trying to use httpfs to write to an encryption zone with security off.  I can read from an encryption zone, but I can't write to one.

Here's the applicable namenode logs.  httpfs and root both have all possible privileges in the KMS.  What am I missing?


2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:authorizeConnection(2095)) - Successfully authorized userInfo {
  effectiveUser: "root"
  realUser: "httpfs"
}
protocol: "org.apache.hadoop.hdfs.protocol.ClientProtocol"

2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:processOneRpc(1902)) -  got #2
2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:run(2179)) - IPC Server handler 9 on 8020: org.apache.hadoop.hdfs.protocol.ClientProtocol.create from 10.41.1.64:47622 Call#2 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER
2016-10-07 15:48:16,165 DEBUG security.UserGroupInformation (UserGroupInformation.java:logPrivilegedAction(1751)) - PrivilegedAction as:root (auth:PROXY) via httpfs (auth:SIMPLE) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2205)
2016-10-07 15:48:16,166 DEBUG hdfs.StateChange (NameNodeRpcServer.java:create(699)) - *DIR* NameNode.create: file /tmp/cryptotest/hairyballs for DFSClient_NONMAPREDUCE_-1005188439_28 at 10.41.1.64
2016-10-07 15:48:16,166 DEBUG hdfs.StateChange (FSNamesystem.java:startFileInt(2411)) - DIR* NameSystem.startFile: src=/tmp/cryptotest/hairyballs, holder=DFSClient_NONMAPREDUCE_-1005188439_28, clientMachine=10.41.1.64, createParent=true, replication=3, createFlag=[CREATE
, OVERWRITE], blockSize=134217728, supportedVersions=[CryptoProtocolVersion{description='Encryption zones', version=2, unknownValue=null}]
2016-10-07 15:48:16,167 DEBUG security.UserGroupInformation (UserGroupInformation.java:logPrivilegedAction(1751)) - PrivilegedAction as:hdfs (auth:SIMPLE) from:org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:484)
2016-10-07 15:48:16,171 DEBUG client.KerberosAuthenticator (KerberosAuthenticator.java:authenticate(205)) - Using fallback authenticator sequence.
2016-10-07 15:48:16,176 DEBUG security.UserGroupInformation (UserGroupInformation.java:doAs(1728)) - PrivilegedActionException as:hdfs (auth:SIMPLE) cause:org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, messag
e: Forbidden
2016-10-07 15:48:16,176 DEBUG ipc.Server (ProtobufRpcEngine.java:call(631)) - Served: create queueTime= 2 procesingTime= 10 exception= IOException
2016-10-07 15:48:16,177 DEBUG security.UserGroupInformation (UserGroupInformation.java:doAs(1728)) - PrivilegedActionException as:root (auth:PROXY) via httpfs (auth:SIMPLE) cause:java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: org.apach
e.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
2016-10-07 15:48:16,177 INFO  ipc.Server (Server.java:logException(2299)) - IPC Server handler 9 on 8020, call org.apache.hadoop.hdfs.protocol.ClientProtocol.create from 10.41.1.64:47622 Call#2 Retry#0
java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider.generateEncryptedKey(KMSClientProvider.java:750)
        at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:371)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.generateEncryptedDataEncryptionKey(FSNamesystem.java:2352)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2478)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2377)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:716)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:405)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2211)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2207)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2205)
Caused by: java.util.concurrent.ExecutionException: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
        at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:289)
        at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:276)
        at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:111)
        at com.google.common.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:132)
        at com.google.common.cache.LocalCache$Segment.getAndRecordStats(LocalCache.java:2381)
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2351)
        at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2313)
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2228)
        at com.google.common.cache.LocalCache.get(LocalCache.java:3965)
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3969)
        at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4829)
        at org.apache.hadoop.crypto.key.kms.ValueQueue.getAtMost(ValueQueue.java:266)
        at org.apache.hadoop.crypto.key.kms.ValueQueue.getNext(ValueQueue.java:226)
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider.generateEncryptedKey(KMSClientProvider.java:745)
        ... 15 more
Caused by: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:495)
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider.access$100(KMSClientProvider.java:84)
        at org.apache.hadoop.crypto.key.kms.KMSClientProvider$EncryptedQueueRefiller.fillQueueForKey(KMSClientProvider.java:133)
        at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:181)
        at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:175)
        at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3568)
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2350)
        ... 23 more
Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
        at org.apache.hadoop.security.authentication.client.AuthenticatedURL.extractToken(AuthenticatedURL.java:274)






Re: Authentication Failure talking to Ranger KMS

Posted by Wei-Chiu Chuang <we...@cloudera.com>.
Seems to me you encountered this bug: HDFS-10481 <https://issues.apache.org/jira/browse/HDFS-10481>
If you’re using CDH, this is fixed in CDH5.5.5, CDH5.7.2 and CDH5.8.2

Wei-Chiu Chuang
A very happy Clouderan

> On Oct 11, 2016, at 8:38 AM, Benjamin Ross <br...@Lattice-Engines.com> wrote:
> 
> All,
> I'm trying to use httpfs to write to an encryption zone with security off.  I can read from an encryption zone, but I can't write to one.
> 
> Here's the applicable namenode logs.  httpfs and root both have all possible privileges in the KMS.  What am I missing?
> 
> 
> 2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:authorizeConnection(2095)) - Successfully authorized userInfo {
>   effectiveUser: "root"
>   realUser: "httpfs"
> }
> protocol: "org.apache.hadoop.hdfs.protocol.ClientProtocol"
> 
> 2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:processOneRpc(1902)) -  got #2
> 2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:run(2179)) - IPC Server handler 9 on 8020: org.apache.hadoop.hdfs.protocol.ClientProtocol.create from 10.41.1.64:47622 Call#2 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER
> 2016-10-07 15:48:16,165 DEBUG security.UserGroupInformation (UserGroupInformation.java:logPrivilegedAction(1751)) - PrivilegedAction as:root (auth:PROXY) via httpfs (auth:SIMPLE) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2205)
> 2016-10-07 15:48:16,166 DEBUG hdfs.StateChange (NameNodeRpcServer.java:create(699)) - *DIR* NameNode.create: file /tmp/cryptotest/hairyballs for DFSClient_NONMAPREDUCE_-1005188439_28 at 10.41.1.64
> 2016-10-07 15:48:16,166 DEBUG hdfs.StateChange (FSNamesystem.java:startFileInt(2411)) - DIR* NameSystem.startFile: src=/tmp/cryptotest/hairyballs, holder=DFSClient_NONMAPREDUCE_-1005188439_28, clientMachine=10.41.1.64, createParent=true, replication=3, createFlag=[CREATE
> , OVERWRITE], blockSize=134217728, supportedVersions=[CryptoProtocolVersion{description='Encryption zones', version=2, unknownValue=null}]
> 2016-10-07 15:48:16,167 DEBUG security.UserGroupInformation (UserGroupInformation.java:logPrivilegedAction(1751)) - PrivilegedAction as:hdfs (auth:SIMPLE) from:org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:484)
> 2016-10-07 15:48:16,171 DEBUG client.KerberosAuthenticator (KerberosAuthenticator.java:authenticate(205)) - Using fallback authenticator sequence.
> 2016-10-07 15:48:16,176 DEBUG security.UserGroupInformation (UserGroupInformation.java:doAs(1728)) - PrivilegedActionException as:hdfs (auth:SIMPLE) cause:org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, messag
> e: Forbidden
> 2016-10-07 15:48:16,176 DEBUG ipc.Server (ProtobufRpcEngine.java:call(631)) - Served: create queueTime= 2 procesingTime= 10 exception= IOException
> 2016-10-07 15:48:16,177 DEBUG security.UserGroupInformation (UserGroupInformation.java:doAs(1728)) - PrivilegedActionException as:root (auth:PROXY) via httpfs (auth:SIMPLE) cause:java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: org.apach
> e.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
> 2016-10-07 15:48:16,177 INFO  ipc.Server (Server.java:logException(2299)) - IPC Server handler 9 on 8020, call org.apache.hadoop.hdfs.protocol.ClientProtocol.create from 10.41.1.64:47622 Call#2 Retry#0
> java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
>         at org.apache.hadoop.crypto.key.kms.KMSClientProvider.generateEncryptedKey(KMSClientProvider.java:750)
>         at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:371)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.generateEncryptedDataEncryptionKey(FSNamesystem.java:2352)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2478)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2377)
>         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:716)
>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:405)
>         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
>         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2211)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2207)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2205)
> Caused by: java.util.concurrent.ExecutionException: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
>         at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:289)
>         at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:276)
>         at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:111)
>         at com.google.common.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:132)
>         at com.google.common.cache.LocalCache$Segment.getAndRecordStats(LocalCache.java:2381)
>         at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2351)
>         at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2313)
>         at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2228)
>         at com.google.common.cache.LocalCache.get(LocalCache.java:3965)
>         at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3969)
>         at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4829)
>         at org.apache.hadoop.crypto.key.kms.ValueQueue.getAtMost(ValueQueue.java:266)
>         at org.apache.hadoop.crypto.key.kms.ValueQueue.getNext(ValueQueue.java:226)
>         at org.apache.hadoop.crypto.key.kms.KMSClientProvider.generateEncryptedKey(KMSClientProvider.java:745)
>         ... 15 more
> Caused by: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
>         at org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:495)
>         at org.apache.hadoop.crypto.key.kms.KMSClientProvider.access$100(KMSClientProvider.java:84)
>         at org.apache.hadoop.crypto.key.kms.KMSClientProvider$EncryptedQueueRefiller.fillQueueForKey(KMSClientProvider.java:133)
>         at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:181)
>         at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:175)
>         at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3568)
>         at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2350)
>         ... 23 more
> Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
>         at org.apache.hadoop.security.authentication.client.AuthenticatedURL.extractToken(AuthenticatedURL.java:274)
> 
> 
> 

Re: Authentication Failure talking to Ranger KMS

Posted by Wei-Chiu Chuang <we...@cloudera.com>.
Seems to me you encountered this bug: HDFS-10481 <https://issues.apache.org/jira/browse/HDFS-10481>
If you're using CDH, this is fixed in CDH 5.5.5, CDH 5.7.2, and CDH 5.8.2.
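For what it's worth, the failing call in the trace above is KMSClientProvider.generateEncryptedKey, which the NameNode runs (as the hdfs user here) whenever a file is created inside an encryption zone. If you want to confirm whether the 403 persists after upgrading, the same KMS call can be exercised directly; a minimal sketch, assuming a hypothetical KMS at kms://http@kmshost:9292/kms and a zone key named "testkey" (substitute the cluster's dfs.encryption.key.provider.uri value and your own key name):

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.crypto.key.KeyProvider;
    import org.apache.hadoop.crypto.key.KeyProviderCryptoExtension;
    import org.apache.hadoop.crypto.key.KeyProviderFactory;

    public class KmsGenerateEdekCheck {
      public static void main(String[] args) throws Exception {
        // Hypothetical provider URI and key name -- replace with the
        // cluster's dfs.encryption.key.provider.uri and the zone's key.
        URI kmsUri = new URI("kms://http@kmshost:9292/kms");
        String zoneKey = "testkey";

        Configuration conf = new Configuration();
        KeyProvider provider = KeyProviderFactory.get(kmsUri, conf);

        // Same operation the NameNode performs in startFileInt: ask the
        // KMS to generate an encrypted data encryption key (EDEK).
        KeyProviderCryptoExtension ext =
            KeyProviderCryptoExtension.createKeyProviderCryptoExtension(provider);
        KeyProviderCryptoExtension.EncryptedKeyVersion edek =
            ext.generateEncryptedKey(zoneKey);

        System.out.println("EDEK generated against key version "
            + edek.getEncryptionKeyVersionName());
      }
    }

Running it as hdfs reproduces the identity the NameNode used above (hdfs, auth:SIMPLE); if this also comes back 403, that would suggest the problem is on the KMS/Ranger side rather than in HttpFS.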

Wei-Chiu Chuang
A very happy Clouderan

> On Oct 11, 2016, at 8:38 AM, Benjamin Ross <br...@Lattice-Engines.com> wrote:
> 
> All,
> I'm trying to use httpfs to write to an encryption zone with security off.  I can read from an encryption zone, but I can't write to one.
> 
> Here's the applicable namenode logs.  httpfs and root both have all possible privileges in the KMS.  What am I missing?
> 
> 
> 2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:authorizeConnection(2095)) - Successfully authorized userInfo {
>   effectiveUser: "root"
>   realUser: "httpfs"
> }
> protocol: "org.apache.hadoop.hdfs.protocol.ClientProtocol"
> 
> 2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:processOneRpc(1902)) -  got #2
> 2016-10-07 15:48:16,164 DEBUG ipc.Server (Server.java:run(2179)) - IPC Server handler 9 on 8020: org.apache.hadoop.hdfs.protocol.ClientProtocol.create from 10.41.1.64:47622 Call#2 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER
> 2016-10-07 15:48:16,165 DEBUG security.UserGroupInformation (UserGroupInformation.java:logPrivilegedAction(1751)) - PrivilegedAction as:root (auth:PROXY) via httpfs (auth:SIMPLE) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2205)
> 2016-10-07 15:48:16,166 DEBUG hdfs.StateChange (NameNodeRpcServer.java:create(699)) - *DIR* NameNode.create: file /tmp/cryptotest/hairyballs for DFSClient_NONMAPREDUCE_-1005188439_28 at 10.41.1.64
> 2016-10-07 15:48:16,166 DEBUG hdfs.StateChange (FSNamesystem.java:startFileInt(2411)) - DIR* NameSystem.startFile: src=/tmp/cryptotest/hairyballs, holder=DFSClient_NONMAPREDUCE_-1005188439_28, clientMachine=10.41.1.64, createParent=true, replication=3, createFlag=[CREATE, OVERWRITE], blockSize=134217728, supportedVersions=[CryptoProtocolVersion{description='Encryption zones', version=2, unknownValue=null}]
> 2016-10-07 15:48:16,167 DEBUG security.UserGroupInformation (UserGroupInformation.java:logPrivilegedAction(1751)) - PrivilegedAction as:hdfs (auth:SIMPLE) from:org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:484)
> 2016-10-07 15:48:16,171 DEBUG client.KerberosAuthenticator (KerberosAuthenticator.java:authenticate(205)) - Using fallback authenticator sequence.
> 2016-10-07 15:48:16,176 DEBUG security.UserGroupInformation (UserGroupInformation.java:doAs(1728)) - PrivilegedActionException as:hdfs (auth:SIMPLE) cause:org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
> 2016-10-07 15:48:16,176 DEBUG ipc.Server (ProtobufRpcEngine.java:call(631)) - Served: create queueTime= 2 procesingTime= 10 exception= IOException
> 2016-10-07 15:48:16,177 DEBUG security.UserGroupInformation (UserGroupInformation.java:doAs(1728)) - PrivilegedActionException as:root (auth:PROXY) via httpfs (auth:SIMPLE) cause:java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
> 2016-10-07 15:48:16,177 INFO  ipc.Server (Server.java:logException(2299)) - IPC Server handler 9 on 8020, call org.apache.hadoop.hdfs.protocol.ClientProtocol.create from 10.41.1.64:47622 Call#2 Retry#0
> java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
>         at org.apache.hadoop.crypto.key.kms.KMSClientProvider.generateEncryptedKey(KMSClientProvider.java:750)
>         at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:371)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.generateEncryptedDataEncryptionKey(FSNamesystem.java:2352)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2478)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2377)
>         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:716)
>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:405)
>         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
>         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2211)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2207)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2205)
> Caused by: java.util.concurrent.ExecutionException: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
>         at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:289)
>         at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:276)
>         at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:111)
>         at com.google.common.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:132)
>         at com.google.common.cache.LocalCache$Segment.getAndRecordStats(LocalCache.java:2381)
>         at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2351)
>         at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2313)
>         at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2228)
>         at com.google.common.cache.LocalCache.get(LocalCache.java:3965)
>         at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3969)
>         at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4829)
>         at org.apache.hadoop.crypto.key.kms.ValueQueue.getAtMost(ValueQueue.java:266)
>         at org.apache.hadoop.crypto.key.kms.ValueQueue.getNext(ValueQueue.java:226)
>         at org.apache.hadoop.crypto.key.kms.KMSClientProvider.generateEncryptedKey(KMSClientProvider.java:745)
>         ... 15 more
> Caused by: java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
>         at org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:495)
>         at org.apache.hadoop.crypto.key.kms.KMSClientProvider.access$100(KMSClientProvider.java:84)
>         at org.apache.hadoop.crypto.key.kms.KMSClientProvider$EncryptedQueueRefiller.fillQueueForKey(KMSClientProvider.java:133)
>         at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:181)
>         at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:175)
>         at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3568)
>         at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2350)
>         ... 23 more
> Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
>         at org.apache.hadoop.security.authentication.client.AuthenticatedURL.extractToken(AuthenticatedURL.java:274)
> 
> 
> 