Posted to user@hbase.apache.org by Srinivas Chamarthi <sr...@gmail.com> on 2013/10/18 07:52:34 UTC

Master shutting down with exception

I am using HBase 0.95-2 with Hadoop 2.1.0-beta and the master is shutting
down with the below exception.



java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status; Host Details : local host is: "xxxxxxx-.localdomain/xxxxxxxx"; destination host is: "xxxxxx.localdomain":8020;
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:761)
        at org.apache.hadoop.ipc.Client.call(Client.java:1239)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
        at com.sun.proxy.$Proxy12.setSafeMode(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
        at com.sun.proxy.$Proxy12.setSafeMode(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setSafeMode(ClientNamenodeProtocolTranslatorPB.java:540)
        at org.apache.hadoop.hdfs.DFSClient.setSafeMode(DFSClient.java:2008)
        at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:693)
        at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:677)
        at org.apache.hadoop.hbase.util.FSUtils.isInSafeMode(FSUtils.java:427)
        at org.apache.hadoop.hbase.util.FSUtils.waitOnSafeMode(FSUtils.java:846)
        at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:436)
        at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:147)
        at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:131)
        at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:761)
        at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:578)
        at java.lang.Thread.run(Thread.java:662)
Caused by: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status
        at com.google.protobuf.UninitializedMessageException.asInvalidProtocolBufferException(UninitializedMessageException.java:81)
        at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.buildParsed(RpcPayloadHeaderProtos.java:1094)
        at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.access$1300(RpcPayloadHeaderProtos.java:1028)
        at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:986)
        at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:946)


thx
srinivas

Re: Master shutting down with exception

Posted by Srinivas Chamarthi <sr...@gmail.com>.
Will do it. Thanks so much 

Sent from my iPhone

> On Oct 18, 2013, at 9:44 AM, Ted Yu <yu...@gmail.com> wrote:
> 
> There have been new releases for both projects.
> 
> Can you go with HBase 0.96.0 RC5 and hadoop 2.2 ?
> 
> Cheers
> 
> 
> On Thu, Oct 17, 2013 at 10:52 PM, Srinivas Chamarthi <
> srinivas.chamarthi@gmail.com> wrote:
> 
>> I am using HBase 0.95-2 with Hadoop 2.1.0-beta and the master is shutting
>> down with the below exception.
>> 
>> [stack trace trimmed; identical to the one in the original message above]
>> 
>> thx
>> srinivas
>> 

Re: Master shutting down with exception

Posted by Ted Yu <yu...@gmail.com>.
There have been new releases for both projects.

Can you go with HBase 0.96.0 RC5 and hadoop 2.2 ?

Cheers


On Thu, Oct 17, 2013 at 10:52 PM, Srinivas Chamarthi <
srinivas.chamarthi@gmail.com> wrote:

> I am using HBase 0.95-2 with Hadoop 2.1.0-beta and the master is shutting
> down with the below exception.
>
> [stack trace trimmed; identical to the one in the original message above]
>
> thx
> srinivas
>
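[Editor's note] The failure in this thread is a wire-protocol mismatch: the Hadoop client jars bundled with HBase 0.95 speak an older RPC format (`RpcPayloadHeaderProtos`) than the Hadoop 2.1.0-beta NameNode they are talking to, so the response header arrives without the `callId` and `status` fields the client requires. Before (or after) upgrading as Ted suggests, the mismatch can be confirmed by comparing the cluster's Hadoop version with the Hadoop jars HBase ships. A minimal sketch; the `/usr/local/hbase` install path is an assumption about a typical layout:

```shell
# Assumed default install location; override with the real one.
HBASE_HOME="${HBASE_HOME:-/usr/local/hbase}"

# Version of the Hadoop the cluster is actually running.
if command -v hadoop >/dev/null 2>&1; then
    hadoop version | head -n 1
else
    echo "hadoop command not found on PATH"
fi

# Hadoop client jars bundled inside the HBase install. Their version
# string should match the cluster version printed above; if it does
# not, HBase must be rebuilt or repackaged against the cluster's Hadoop.
ls "$HBASE_HOME"/lib/hadoop-*.jar 2>/dev/null \
    || echo "no hadoop jars found under $HBASE_HOME/lib"
```

If the versions disagree, the usual remedies are either upgrading both sides to matching releases (as suggested above) or replacing the hadoop jars under `$HBASE_HOME/lib` with the ones from the running cluster.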