Posted to user@hbase.apache.org by Louis Hust <lo...@gmail.com> on 2015/06/30 05:31:11 UTC

Scan got exception

Hi, all

When I scan a table using the HBase shell, I get the following message:

hbase(main):001:0> scan 'atpco:ttf_record6'
ROW                                              COLUMN+CELL

ERROR: org.apache.hadoop.hbase.exceptions.OutOfOrderScannerNextException:
Expected nextCallSeq: 1 But the nextCallSeq got from client: 0;
request=scanner_id: 201542113 number_of_rows: 100 close_scanner: false
next_call_seq: 0
at org.apache.hadoop.hbase.regionserver.HRegionServer.scan(HRegionServer.java:3193)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:29587)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2031)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:114)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:94)
at java.lang.Thread.run(Thread.java:744)
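
A note on this client-side error: the shell scan is just the ordinary client Scan API pulling rows in batches, and the "number_of_rows: 100" in the failed request is the scanner caching. An OutOfOrderScannerNextException typically appears when a next() RPC has to be retried, for example after the previous attempt dies with the region-server error shown below, so the client's call sequence no longer matches the one the server expects. Here is a minimal sketch of the equivalent client code against the 0.98-era API; the table name and the caching value are taken from this thread, while the class name and the printing are only illustrative.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;

public class ScanSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    // Same table as the shell command above.
    HTable table = new HTable(conf, "atpco:ttf_record6");
    try {
      Scan scan = new Scan();
      // Each next() RPC asks the region server for up to 100 rows,
      // matching "number_of_rows: 100" in the failed request.
      scan.setCaching(100);
      ResultScanner scanner = table.getScanner(scan);
      try {
        for (Result row : scanner) {
          System.out.println(row);
        }
      } finally {
        scanner.close();
      }
    } finally {
      table.close();
    }
  }
}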


And the region server logged the following error:

2015-06-30 11:08:11,877 ERROR [B.defaultRpcServer.handler=27,queue=0,port=60020] ipc.RpcServer: Unexpected throwable object
java.lang.IllegalArgumentException: Negative position
        at sun.nio.ch.FileChannelImpl.read(FileChannelImpl.java:675)
        at org.apache.hadoop.hbase.io.hfile.bucket.FileIOEngine.read(FileIOEngine.java:87)
        at org.apache.hadoop.hbase.io.hfile.bucket.BucketCache.getBlock(BucketCache.java:406)
        at org.apache.hadoop.hbase.io.hfile.LruBlockCache.getBlock(LruBlockCache.java:389)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:359)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.readNextDataBlock(HFileReaderV2.java:635)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.next(HFileReaderV2.java:749)
        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:136)
        at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:108)
        at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:507)
        at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:140)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.populateResult(HRegion.java:3900)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:3980)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:3858)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:3849)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.scan(HRegionServer.java:3245)
        at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:29587)
        at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2031)
        at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)

Re: Scan got exception

Posted by Louis Hust <lo...@gmail.com>.
I have reported a bug: https://issues.apache.org/jira/browse/HBASE-14046

Re: Scan got exception

Posted by Louis Hust <lo...@gmail.com>.
Yes, I will report it as a bug.

Re: Scan got exception

Posted by ramkrishna vasudevan <ra...@gmail.com>.
+1 to what Vladimir says. If you can reproduce it, that would be great too.

Re: Scan got exception

Posted by Vladimir Rodionov <vl...@gmail.com>.
Is this issue reproducible? If yes, please submit a bug.

-Vlad

Re: Scan got exception

Posted by Louis Hust <lo...@gmail.com>.
Any idea?

Re: Scan got exception

Posted by Louis Hust <lo...@gmail.com>.
So CDH 5.2.0 is patched with HBASE-11678?

Re: Scan got exception

Posted by Stack <st...@duboce.net>.
I checked, Vladimir, and 5.2.0 is the first release with the necessary HBASE-11678 "BucketCache ramCache fills heap after running a few hours".

FYI,
Thanks,
St.Ack

Re: Scan got exception

Posted by Vladimir Rodionov <vl...@gmail.com>.
I believe CDH 5.2.0 does not contain all BucketCache critical patches, but I may be wrong.

-Vlad

Re: Scan got exception

Posted by Louis Hust <lo...@gmail.com>.
<property>
  <name>hbase.bucketcache.size</name>
  <value>800000</value>
  <source>hbase-site.xml</source>
</property>
<property>
  <name>hbase.bucketcache.ioengine</name>
  <value>file:/export/hbase/cache.data</value>
  <source>hbase-site.xml</source>
</property>
<property>
  <name>hbase.bucketcache.combinedcache.enabled</name>
  <value>false</value>
  <source>hbase-site.xml</source>
</property>
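
A small sketch for reading the same three keys back from an HBase Configuration built from the classpath (hbase-default.xml plus hbase-site.xml), in case anyone wants to double-check what a process on this configuration directory actually sees; the class name and the fallback defaults passed to the getters are illustrative, not from this thread.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;

public class BucketCacheConfigCheck {
  public static void main(String[] args) {
    // Loads hbase-default.xml and hbase-site.xml from the classpath,
    // the same hbase-site.xml the <source> tags above point at.
    Configuration conf = HBaseConfiguration.create();

    System.out.println("ioengine      = " + conf.get("hbase.bucketcache.ioengine"));
    System.out.println("size          = " + conf.getFloat("hbase.bucketcache.size", 0f));
    System.out.println("combinedcache = " + conf.getBoolean("hbase.bucketcache.combinedcache.enabled", true));
  }
}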

2015-06-30 12:22 GMT+08:00 Ted Yu <yu...@gmail.com>:

> How do you configure BucketCache ?

Re: Scan got exception

Posted by Ted Yu <yu...@gmail.com>.
How do you configure BucketCache?

Thanks

On Mon, Jun 29, 2015 at 8:35 PM, Louis Hust <lo...@gmail.com> wrote:

> BTW, the hbase is hbase0.98.6 CHD5.2.0
>

Re: Scan got exception

Posted by Louis Hust <lo...@gmail.com>.
BTW, the HBase version is 0.98.6 (CDH 5.2.0).
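
As an aside, one quick way to confirm which HBase build a client is actually picking up is the VersionInfo utility bundled with HBase; a minimal sketch (the class name is a placeholder, and for a CDH build the printed string should look something like 0.98.6-cdh5.2.0):

import org.apache.hadoop.hbase.util.VersionInfo;

public class PrintHBaseVersion {
    public static void main(String[] args) {
        // Prints the version string compiled into the hbase jar on the classpath.
        System.out.println(VersionInfo.getVersion());
    }
}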

2015-06-30 11:31 GMT+08:00 Louis Hust <lo...@gmail.com>:

> Hi, all
>
> When I scan a table using hbase shell, got the following message:
>
> hbase(main):001:0> scan 'atpco:ttf_record6'
> ROW                                              COLUMN+CELL
>
> ERROR: org.apache.hadoop.hbase.exceptions.OutOfOrderScannerNextException:
> Expected nextCallSeq: 1 But the nextCallSeq got from client: 0;
> request=scanner_id: 201542113 number_of_rows: 100 close_scanner: false
> next_call_seq: 0
> at
> org.apache.hadoop.hbase.regionserver.HRegionServer.scan(HRegionServer.java:3193)
> at
> org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:29587)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2031)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
> at
> org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:114)
> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:94)
> at java.lang.Thread.run(Thread.java:744)
>
>
> *And the region server got the following error:*
>
> 2015-06-30 11:08:11,877 ERROR
> [B.defaultRpcServer.handler=27,queue=0,port=60020] ipc.RpcServer:
> Unexpected throwable object
> java.lang.IllegalArgumentException: Negative position
>         at sun.nio.ch.FileChannelImpl.read(FileChannelImpl.java:675)
>         at
> org.apache.hadoop.hbase.io.hfile.bucket.FileIOEngine.read(FileIOEngine.java:87)
>         at
> org.apache.hadoop.hbase.io.hfile.bucket.BucketCache.getBlock(BucketCache.java:406)
>         at
> org.apache.hadoop.hbase.io.hfile.LruBlockCache.getBlock(LruBlockCache.java:389)
>         at
> org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:359)
>         at
> org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.readNextDataBlock(HFileReaderV2.java:635)
>         at
> org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.next(HFileReaderV2.java:749)
>         at
> org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:136)
>         at
> org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:108)
>         at
> org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:507)
>         at
> org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:140)
>         at
> org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.populateResult(HRegion.java:3900)
>         at
> org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:3980)
>         at
> org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:3858)
>         at
> org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:3849)
>         at
> org.apache.hadoop.hbase.regionserver.HRegionServer.scan(HRegionServer.java:3245)
>         at
> org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:29587)
>         at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2031)
>         at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:108)
>