Posted to user@hbase.apache.org by Margus Roo <ma...@roo.ee> on 2014/12/09 07:45:03 UTC
Use CodedInputStream.setSizeLimit() to increase the size limit on read of more than 64M content
Hi
I am getting:
org.apache.hadoop.hbase.client.RpcRetryingCaller@4d6a54b0,
java.io.IOException: Call to regionserver1/192.168.81.166:60020 failed
on local exception: com.google.protobuf.InvalidProtocolBufferException:
Protocol message was too large. May be malicious. Use
CodedInputStream.setSizeLimit() to increase the size limit.
I am using the Java API:
...
// Get the image from the table
HTablePool pool = new HTablePool(getConf(), 1);
HTableInterface usersTable = pool.getTable(table);

Get g = new Get(Bytes.toBytes(uid));
//g.addFamily(Bytes.toBytes("info"));
Result r = usersTable.get(g);

byte[] b = r.getValue(Bytes.toBytes(cf), Bytes.toBytes("content"));
String image = Bytes.toString(b);

FileOutputStream imageOutFile = new FileOutputStream(outFileName);
imageOutFile.write(Base64.decodeBase64(image));
imageOutFile.close();
...
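For reference, the decode step in the snippet can be reproduced standalone with the JDK's own java.util.Base64 instead of the commons codec; this is a minimal sketch assuming, as the code above does, that the cell value holds Base64 text:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class DecodeImage {
    // Decode a Base64-encoded cell value (as raw bytes) back into the original image bytes.
    static byte[] decodeCell(byte[] cellValue) {
        String base64 = new String(cellValue, StandardCharsets.UTF_8);
        return Base64.getDecoder().decode(base64);
    }

    public static void main(String[] args) {
        // Simulate what would have been stored in the "content" column.
        byte[] raw = "not really an image".getBytes(StandardCharsets.UTF_8);
        byte[] cell = Base64.getEncoder().encode(raw); // stored form
        byte[] back = decodeCell(cell);                // what the snippet reconstructs
        System.out.println(new String(back, StandardCharsets.UTF_8));
    }
}
```

Note the decode itself is not the problem here; the failure happens earlier, while the client parses the RPC response.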
As I understand it, com.google.protobuf rejects messages larger than 64M.
Is that related to https://issues.apache.org/jira/browse/HBASE-11747 ?
Is there a workaround?
--
Margus (margusja) Roo
http://margus.roo.ee
skype: margusja
+372 51 480
Re: Use CodedInputStream.setSizeLimit() to increase the size limit on read of more than 64M content
Posted by Margus Roo <ma...@roo.ee>.
One more thing: I can put files larger than 64M into HBase using the Java
API, and there is no problem with the message size there.
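That asymmetry would be consistent with the limit being enforced on the client side only when parsing the response: protobuf's CodedInputStream refuses to read past its size limit (64 MiB by default) while decoding an incoming message, whereas writing a request does not pass through that check. A pure-JDK sketch of the kind of guard involved (the names here are illustrative, not protobuf's actual internals; the real check lives in CodedInputStream.refillBuffer()):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class SizeLimitedRead {
    static final int DEFAULT_SIZE_LIMIT = 64 * 1024 * 1024; // 64 MiB, protobuf's default

    // Consume the whole stream, failing once more than sizeLimit bytes
    // have been read -- roughly the shape of protobuf's size-limit check.
    static int readAll(InputStream in, int sizeLimit) throws IOException {
        int total = 0;
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            total += n;
            if (total > sizeLimit) {
                throw new IOException("Message too large: exceeded " + sizeLimit + " bytes");
            }
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // A small message passes under the default limit.
        System.out.println(readAll(new ByteArrayInputStream(new byte[1024]), DEFAULT_SIZE_LIMIT));
        // A message over the (artificially small) limit is rejected mid-read.
        try {
            readAll(new ByteArrayInputStream(new byte[2048]), 1024);
        } catch (IOException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

Writing a large Put never triggers this, because the client only serializes outgoing data; it is the Get response that must fit under the parser's limit.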
Margus (margusja) Roo
http://margus.roo.ee
skype: margusja
+372 51 480
Re: Use CodedInputStream.setSizeLimit() to increase the size limit on read of more than 64M content
Posted by Margus Roo <ma...@roo.ee>.
Hi, some more information:
Connecting to bigdata3/192.168.80.232:60020
IPC Client (155412783) connection to bigdata3/192.168.80.232:60020 from margusja: wrote request header call_id: 16 method_name: "Get" request_param: true
IPC Client (155412783) connection to bigdata3/192.168.80.232:60020 from margusja: starting, connections 2
IPC Client (155412783) connection to bigdata3/192.168.80.232:60020 from margusja: got response header call_id: 16, totalSize: 68266970 bytes
IPC Client (155412783) connection to bigdata3/192.168.80.232:60020 from margusja: closing ipc connection to bigdata3.webmedia.int/192.168.80.232:60020: Protocol message was too large. May be malicious. Use CodedInputStream.setSizeLimit() to increase the size limit.
com.google.protobuf.InvalidProtocolBufferException: Protocol message was too large. May be malicious. Use CodedInputStream.setSizeLimit() to increase the size limit.
	at com.google.protobuf.InvalidProtocolBufferException.sizeLimitExceeded(InvalidProtocolBufferException.java:110)
	at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:755)
	at com.google.protobuf.CodedInputStream.isAtEnd(CodedInputStream.java:701)
	at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:99)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$Result.<init>(ClientProtos.java:4086)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$Result.<init>(ClientProtos.java:4050)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$Result$1.parsePartialFrom(ClientProtos.java:4149)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$Result$1.parsePartialFrom(ClientProtos.java:4144)
	at com.google.protobuf.CodedInputStream.readMessage(CodedInputStream.java:309)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$GetResponse.<init>(ClientProtos.java:5951)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$GetResponse.<init>(ClientProtos.java:5898)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$GetResponse$1.parsePartialFrom(ClientProtos.java:5989)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$GetResponse$1.parsePartialFrom(ClientProtos.java:5984)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$GetResponse$Builder.mergeFrom(ClientProtos.java:6282)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$GetResponse$Builder.mergeFrom(ClientProtos.java:6171)
	at com.google.protobuf.AbstractMessageLite$Builder.mergeFrom(AbstractMessageLite.java:220)
	at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:912)
	at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:267)
	at com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:290)
	at com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:926)
	at com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:296)
	at com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:918)
	at org.apache.hadoop.hbase.ipc.RpcClient$Connection.readResponse(RpcClient.java:1099)
	at org.apache.hadoop.hbase.ipc.RpcClient$Connection.run(RpcClient.java:726)
IPC Client (155412783) connection to bigdata3/192.168.80.232:60020 from margusja: closed
IPC Client (155412783) connection to bigdata3/192.168.80.232:60020 from margusja: stopped, connections 1
IPC Client (155412783) connection to bigdata2/192.168.80.70:60020 from margusja: closed
IPC Client (155412783) connection to bigdata2/192.168.80.70:60020 from margusja: stopped, connections 0
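The totalSize in the log is indeed just over protobuf's default limit: 68266970 bytes against a default of 64 MiB = 67108864 bytes. A quick arithmetic check (plain Java, nothing HBase-specific):

```java
public class LimitCheck {
    public static void main(String[] args) {
        int defaultLimit = 64 * 1024 * 1024; // protobuf's default size limit in bytes
        int responseSize = 68266970;         // totalSize from the IPC log above
        System.out.println(defaultLimit);               // 67108864
        System.out.println(responseSize > defaultLimit); // true -> the parse is rejected
    }
}
```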
Any hints?
Margus (margusja) Roo
http://margus.roo.ee
skype: margusja
+372 51 480