Posted to user@hbase.apache.org by Konrad Tendera <em...@tendera.eu> on 2012/04/17 14:08:20 UTC

HBase Security Configuration

Hello,
I'm trying to configure secure HBase using the following instructions: https://ccp.cloudera.com/display/CDHDOC/HBase+Security+Configuration. Our cluster uses Kerberos and everything in Hadoop works fine. But when I start HBase, the following exception is thrown:

FATAL org.apache.hadoop.hbase.master.HMaster: Unhandled exception. Starting shutdown.
org.apache.hadoop.security.AccessControlException: Authentication is required
	at org.apache.hadoop.ipc.Client.call(Client.java:1028)
	at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:198)
	at $Proxy9.getProtocolVersion(Unknown Source)
	at org.apache.hadoop.ipc.WritableRpcEngine.getProxy(WritableRpcEngine.java:235)
	at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:275)
	at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:249)
	at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:161)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:278)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:245)
	at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:109)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1792)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:76)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1826)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1808)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:265)
	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:189)
	at org.apache.hadoop.hbase.util.FSUtils.getRootDir(FSUtils.java:471)
	at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:94)
	at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:448)
	at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:326)
	at java.lang.Thread.run(Thread.java:662)

I can't find any info about it. I'm using HBase 0.92 with Hadoop 0.22.

-- 
Konrad Tendera

Re: HBase Security Configuration

Posted by Harsh J <ha...@cloudera.com>.
Hey Konrad,

Make sure your HBase's classpath also has the Hadoop conf dir on it
(specifically hdfs-site.xml and core-site.xml). If it already does
have that, make sure they are populated with the right HDFS cluster
values (core-site needs the two properties that toggle security ON, and
hdfs-site needs the HDFS server principals configured inside it -
basically, just copy these core-site and hdfs-site files from your
secured HDFS cluster config over to the HBase machines/classpath).
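[Editor's note: as an illustrative sketch of the properties Harsh refers to - the property names are standard Hadoop security configuration keys, but the principal values and realm (EXAMPLE.COM) shown here are placeholders and must match your own cluster:]

```xml
<!-- core-site.xml: the two properties that switch security on -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>

<!-- hdfs-site.xml: the HDFS server principals (values are examples) -->
<property>
  <name>dfs.namenode.kerberos.principal</name>
  <value>hdfs/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>dfs.datanode.kerberos.principal</name>
  <value>hdfs/_HOST@EXAMPLE.COM</value>
</property>
```

[One common way to get the Hadoop conf dir onto HBase's classpath is via hbase-env.sh, e.g. export HBASE_CLASSPATH=/etc/hadoop/conf - the path is an assumption and depends on your install layout.]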

On Tue, Apr 17, 2012 at 5:38 PM, Konrad Tendera <em...@tendera.eu> wrote:
> Hello,
> I'm trying to configure secure HBase using the following instructions: https://ccp.cloudera.com/display/CDHDOC/HBase+Security+Configuration. Our cluster uses Kerberos and everything in Hadoop works fine. But when I start HBase, the following exception is thrown:
>
> FATAL org.apache.hadoop.hbase.master.HMaster: Unhandled exception. Starting shutdown.
> org.apache.hadoop.security.AccessControlException: Authentication is required
> [stack trace snipped - see original message above]
>
> I can't find any info about it. I'm using HBase 0.92 with Hadoop 0.22.
>
> --
> Konrad Tendera



-- 
Harsh J