Posted to user@hbase.apache.org by Arthur Chan <ar...@gmail.com> on 2015/06/17 14:52:04 UTC

HBase Start Error: java.io.IOException: com.google.protobuf.ServiceException: java.lang.NoSuchMethodError:

Hi,


My HBase: 0.98.11
Protobuf: 2.6.0   (I cannot use 2.5.0 because my machine is ppc64le:
uname -m
    ppc64le
)

I am running HBase 0.98.11 on ppc64le with Protobuf 2.6.0; Hadoop 2.6.0 runs smoothly on the same machine.
When I try to start the HMaster, I get the following errors:


Issue 1:  hbase shell

hbase shell

io/console not supported; tty will not be manipulated
include_class is deprecated. Use java_import.
include_class is deprecated. Use java_import.
include_class is deprecated. Use java_import.
2015-06-17 20:48:04,876 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
NoMethodError: undefined method `getTerminal' for Java::Jline::Terminal:Module
  refresh_width at /hbase/lib/ruby/shell/formatter.rb:33
     initialize at /hbase/lib/ruby/shell/formatter.rb:46
         (root) at /hbase/bin/hirb.rb:115


Issue 2: from the HBase master log
2015-06-17 20:36:49,150 DEBUG [main-EventThread] master.ActiveMasterManager: A master is now available
2015-06-17 20:36:49,155 INFO  [master:master:60000] Configuration.deprecation: fs.default.name is deprecated. Instead, use fs.defaultFS
2015-06-17 20:36:49,285 INFO  [master:master:60000] retry.RetryInvocationHandler: Exception while invoking getBlockLocations of class ClientNamenodeProtocolTranslatorPB over master/10.10.10.1:8020. Trying to fail over immediately.
java.io.IOException: com.google.protobuf.ServiceException: java.lang.NoSuchMethodError: com.google.protobuf.LazyStringList.getUnmodifiableView()Lcom/google/protobuf/LazyStringList;
    at org.apache.hadoop.ipc.ProtobufHelper.getRemoteException(ProtobufHelper.java:47)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:259)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy16.getBlockLocations(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1220)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1210)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1200)
    at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:271)
    at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:238)
    at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:231)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1498)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:303)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:298)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:311)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:766)
    at org.apache.hadoop.hbase.util.FSUtils.getVersion(FSUtils.java:509)
    at org.apache.hadoop.hbase.util.FSUtils.checkVersion(FSUtils.java:595)
    at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:462)
    at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:153)
    at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:129)
    at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:880)
    at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:683)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.protobuf.ServiceException: java.lang.NoSuchMethodError: com.google.protobuf.LazyStringList.getUnmodifiableView()Lcom/google/protobuf/LazyStringList;
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:274)
    at com.sun.proxy.$Proxy15.getBlockLocations(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:254)
    ... 27 more
Caused by: java.lang.NoSuchMethodError: com.google.protobuf.LazyStringList.getUnmodifiableView()Lcom/google/protobuf/LazyStringList;
    at org.apache.hadoop.hdfs.protocol.proto.HdfsProtos$LocatedBlockProto.<init>(HdfsProtos.java:13128)
    at org.apache.hadoop.hdfs.protocol.proto.HdfsProtos$LocatedBlockProto.<init>(HdfsProtos.java:12954)
    at org.apache.hadoop.hdfs.protocol.proto.HdfsProtos$LocatedBlockProto$1.parsePartialFrom(HdfsProtos.java:13152)
    at org.apache.hadoop.hdfs.protocol.proto.HdfsProtos$LocatedBlockProto$1.parsePartialFrom(HdfsProtos.java:13147)
    at com.google.protobuf.CodedInputStream.readMessage(CodedInputStream.java:309)
    at org.apache.hadoop.hdfs.protocol.proto.HdfsProtos$LocatedBlocksProto.<init>(HdfsProtos.java:19527)
    at org.apache.hadoop.hdfs.protocol.proto.HdfsProtos$LocatedBlocksProto.<init>(HdfsProtos.java:19468)
    at org.apache.hadoop.hdfs.protocol.proto.HdfsProtos$LocatedBlocksProto$1.parsePartialFrom(HdfsProtos.java:19599)
    at org.apache.hadoop.hdfs.protocol.proto.HdfsProtos$LocatedBlocksProto$1.parsePartialFrom(HdfsProtos.java:19594)
    at com.google.protobuf.CodedInputStream.readMessage(CodedInputStream.java:309)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$GetBlockLocationsResponseProto.<init>(ClientNamenodeProtocolProtos.java:1409)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$GetBlockLocationsResponseProto.<init>(ClientNamenodeProtocolProtos.java:1355)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$GetBlockLocationsResponseProto$1.parsePartialFrom(ClientNamenodeProtocolProtos.java:1447)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$GetBlockLocationsResponseProto$1.parsePartialFrom(ClientNamenodeProtocolProtos.java:1442)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$GetBlockLocationsResponseProto$Builder.mergeFrom(ClientNamenodeProtocolProtos.java:1752)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$GetBlockLocationsResponseProto$Builder.mergeFrom(ClientNamenodeProtocolProtos.java:1634)
    at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:337)
    at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:267)
    at com.google.protobuf.AbstractMessageLite$Builder.mergeFrom(AbstractMessageLite.java:170)
    at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:882)
    at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:267)
    at com.google.protobuf.AbstractMessageLite$Builder.mergeFrom(AbstractMessageLite.java:161)
    at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:875)
    at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:267)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:265)
    ... 29 more
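
For reference: a NoSuchMethodError on a protobuf class like this generally means the protobuf-java jar loaded at runtime is not the version the Hadoop/HBase protobuf-generated classes were compiled against; Hadoop 2.6.0 and HBase 0.98 both build against protobuf-java 2.5.0. A quick way to see which jars are actually picked up is to look at the bundled libs and the runtime classpath. The commands below are only a sketch; the paths assume a standard layout with HADOOP_HOME and HBASE_HOME set.

  # protobuf-java jars bundled with each component
  ls $HADOOP_HOME/share/hadoop/common/lib/ | grep protobuf
  ls $HBASE_HOME/lib/ | grep protobuf

  # protobuf jar that actually ends up on the HBase classpath
  $HBASE_HOME/bin/hbase classpath | tr ':' '\n' | grep -i protobuf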


Please help.

Regards

Re: HBase Start Error: java.io.IOException: com.google.protobuf.ServiceException: java.lang.NoSuchMethodError:

Posted by Esteban Gutierrez <es...@cloudera.com>.
Hello Arthur,

Our apologies for the delay here. Why can't you use protobuf 2.5? I don't
see anything special in that version that would cause an issue with ppc64le.
Worst case, assuming you are running Linux on that architecture, there is a
protobuf compiler for ppc64el on rpmfind:
http://rpmfind.net/linux/rpm2html/search.php?query=protobuf-compiler(ppc-64)
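
As a rough sketch (package names and release suffixes below are illustrative; use whatever rpmfind actually lists for your distro on ppc64le):

  # install the protobuf runtime and compiler packages
  rpm -ivh protobuf-2.5.0-*.ppc64le.rpm protobuf-compiler-2.5.0-*.ppc64le.rpm

  # confirm the compiler version
  protoc --version    # should print: libprotoc 2.5.0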

Regarding the other issue, it seems there are some known issues with JRuby
on ppc64el; see https://github.com/jruby/jruby/issues/2790
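
If it helps narrow that one down: the error means the jline Terminal class the shell loads does not expose the getTerminal method that formatter.rb calls, so a cheap first check is which jline/JRuby jars end up on the shell's classpath. A sketch, assuming a standard HBase layout with HBASE_HOME set:

  ls $HBASE_HOME/lib/ | grep -iE 'jline|jruby'
  $HBASE_HOME/bin/hbase classpath | tr ':' '\n' | grep -iE 'jline|jruby'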

cheers,
esteban.




--
Cloudera, Inc.

