Posted to user@hbase.apache.org by Bai Shen <ba...@gmail.com> on 2012/09/19 16:27:07 UTC

HBase ChecksumException IllegalArgumentException

I'm running Nutch 2 using HBase as my backend in local mode.  Everything
seems to be working correctly except for readdb.  When I run readdb, I get
the following stack trace.

2012-09-19 10:15:46,485 WARN  mapred.LocalJobRunner - job_local_0001
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after
attempts=10, exceptions:
Wed Sep 19 10:15:07 EDT 2012,
org.apache.hadoop.hbase.client.ScannerCallable@345ac4dc,
java.io.IOException: java.io.IOException: Could not iterate
StoreFileScanner[HFileScanner for reader
reader=file:/data1/hbase/root/webpage/583a33aae4c4003021da635aba2f70c4/ol/d09b3cd6eb0b478cbd6f64d420e42034,
compression=none, cacheConf=CacheConfig:enabled [cacheDataOnRead=true]
[cacheDataOnWrite=false] [cacheIndexesOnWrite=false]
[cacheBloomsOnWrite=false] [cacheEvictOnClose=false]
[cacheCompressed=false], firstKey=edu.ndu.www:http/aa/catalogs.cfm/ol:
http://www.ndu.edu/aa/catalogs.cfm/1348036803793/Put,
lastKey=edu.ucla.anderson.www:http/mba-admissions.xml/ol:
http://www.anderson.ucla.edu/x40700.xml/1348036827378/Put, avgKeyLen=161,
avgValueLen=14, entries=17405, length=3238861,
cur=edu.nps.www:http/About/News/NPS-Crushes-CubeSats-for-DARPA-Challenge.html/ol:
http://www.nps.edu/Technology/HPC/ContactHPC.html/1348036805774/Put/vlen=11]
        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:104)
        at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:106)
        at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:289)
        at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:138)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:2978)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2925)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2942)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:2159)
        at sun.reflect.GeneratedMethodAccessor12.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:364)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1336)
Caused by: org.apache.hadoop.fs.ChecksumException: Checksum error: file:/data1/hbase/root/webpage/583a33aae4c4003021da635aba2f70c4/ol/d09b3cd6eb0b478cbd6f64d420e42034 at 1378304 exp: -89200966 got: -2503767
        at org.apache.hadoop.fs.FSInputChecker.verifySums(FSInputChecker.java:320)
        at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:276)
        at org.apache.hadoop.fs.FSInputChecker.fill(FSInputChecker.java:211)
        at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:229)
        at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:193)
        at org.apache.hadoop.fs.FSInputChecker.readFully(FSInputChecker.java:431)
        at org.apache.hadoop.fs.FSInputChecker.seek(FSInputChecker.java:412)
        at org.apache.hadoop.fs.FSDataInputStream.seek(FSDataInputStream.java:47)
        at org.apache.hadoop.fs.ChecksumFileSystem$FSDataBoundedInputStream.seek(ChecksumFileSystem.java:318)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader.readAtOffset(HFileBlock.java:1047)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockData(HFileBlock.java:1318)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:266)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.readNextDataBlock(HFileReaderV2.java:452)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.next(HFileReaderV2.java:416)
        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:99)
        ... 12 more

Wed Sep 19 10:15:08 EDT 2012,
org.apache.hadoop.hbase.client.ScannerCallable@345ac4dc,
java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException
        at org.apache.hadoop.hbase.regionserver.HRegionServer.convertThrowableToIOE(HRegionServer.java:1084)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.convertThrowableToIOE(HRegionServer.java:1073)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:2186)
        at sun.reflect.GeneratedMethodAccessor12.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:364)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1336)
Caused by: java.lang.IllegalArgumentException
        at java.nio.Buffer.position(Buffer.java:236)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.next(HFileReaderV2.java:395)
        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:99)
        at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:106)
        at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:326)
        at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:138)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:2978)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2925)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:2942)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:2159)
        ... 5 more


I've run "hbase hbck" and it returns the following.

Summary:
  -ROOT- is okay.
    Number of regions: 1
    Deployed on:  node9-0,53595,1348062612459
  .META. is okay.
    Number of regions: 1
    Deployed on:  node9-0,53595,1348062612459
  webpage is okay.
    Number of regions: 18
    Deployed on:  node9-0,53595,1348062612459
0 inconsistencies detected.


Any suggestions on what's wrong and how to fix it?

Thanks.
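
For reference, the ChecksumException above comes from Hadoop's
ChecksumFileSystem, which is what HBase reads through when hbase.rootdir
points at the local filesystem (hence the 'file:' scheme in the path).  Each
store file has a hidden .crc sidecar in the same directory, and the error
means the bytes read around offset 1378304 no longer match the stored CRC,
so either the HFile or its sidecar is damaged on disk.  A rough way to look
at the file directly, assuming the path from the stack trace and the HFile
tool that ships with HBase 0.92 (exact flags may differ by build):

# list the store file and its .crc sidecar (the ".<name>.crc" naming is
# ChecksumFileSystem's convention)
ls -la /data1/hbase/root/webpage/583a33aae4c4003021da635aba2f70c4/ol/

# dump the HFile's metadata and key/values; if the corruption is real this
# should fail at roughly the same offset the scanner did
hbase org.apache.hadoop.hbase.io.hfile.HFile -m -p -f \
  file:/data1/hbase/root/webpage/583a33aae4c4003021da635aba2f70c4/ol/d09b3cd6eb0b478cbd6f64d420e42034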

Re: HBase ChecksumException IllegalArgumentException

Posted by Stack <st...@duboce.net>.
On Mon, Sep 24, 2012 at 6:17 AM, Bai Shen <ba...@gmail.com> wrote:
> I'm still getting checksum errors for some reason.  Things run fine and
> then start erroring out with checksum errors.  Any ideas for what I can
> look at to figure out why I'm getting the checksum errors?
>

You say you are running in local mode?  You are using the local
filesystem?  I would suggest you move to HDFS; the local filesystem mode
still needs work.

St.Ack
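
Moving off the local filesystem means pointing hbase.rootdir at an HDFS URI
in hbase-site.xml and restarting HBase.  A minimal sketch, assuming a
NameNode reachable at namenode.example.com:8020 (hostname, port, and the
/hbase directory are placeholders, and any existing data under
file:/data1/hbase would have to be copied into HDFS separately, e.g. with
hadoop fs -copyFromLocal):

<property>
  <name>hbase.rootdir</name>
  <value>hdfs://namenode.example.com:8020/hbase</value>
</property>
<property>
  <name>hbase.cluster.distributed</name>
  <value>true</value>
</property>

hbase.cluster.distributed=true is the usual companion setting once HBase and
HDFS run as separate daemons rather than inside a single local-mode process.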

Re: HBase ChecksumException IllegalArgumentException

Posted by Bai Shen <ba...@gmail.com>.
I'm still getting checksum errors for some reason.  Things run fine and
then start erroring out with checksum errors.  Any ideas for what I can
look at to figure out why I'm getting the checksum errors?

Thanks.
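
Recurring checksum errors on a local filesystem usually point at the disk or
memory rather than at HBase itself, since the CRCs matched when the store
files were written.  A few things worth checking on the node, as a sketch
(the /dev/sdb device name is taken from the fsck run elsewhere in this
thread; adjust to your layout):

# kernel-level I/O errors logged around the time of the failures
dmesg | grep -i -E 'error|sdb'

# SMART health of the drive backing /data1
smartctl -H -a /dev/sdb

# read the suspect store file twice; if the two sums differ, the hardware is
# returning different bytes on each read
md5sum /data1/hbase/root/webpage/583a33aae4c4003021da635aba2f70c4/ol/d09b3cd6eb0b478cbd6f64d420e42034
md5sum /data1/hbase/root/webpage/583a33aae4c4003021da635aba2f70c4/ol/d09b3cd6eb0b478cbd6f64d420e42034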


Re: HBase ChecksumException IllegalArgumentException

Posted by Bai Shen <ba...@gmail.com>.
I just ran fsck and this is the result.

[root@node9-0 ~]# fsck -f /dev/sdb1
fsck from util-linux-ng 2.17.2
e2fsck 1.41.12 (17-May-2010)
Pass 1: Checking inodes, blocks, and sizes
Pass 2: Checking directory structure
Pass 3: Checking directory connectivity
Pass 4: Checking reference counts
Pass 5: Checking group summary information
/dev/sdb1: 800/122101760 files (34.6% non-contiguous), 10295560/488378368
blocks



Re: HBase ChecksumException IllegalArgumentException

Posted by Ted Yu <yu...@gmail.com>.
Did you notice the 'file:' scheme for your file?

Have you run fsck to see if your HDFS is healthy?

Cheers
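
Two different fscks apply here: hadoop fsck reports on HDFS block health,
while e2fsck (the run shown elsewhere in this thread) checks the local ext
filesystem's metadata and says nothing about whether the bytes inside a
particular store file are intact.  Since the path uses the 'file:' scheme,
only the local check is relevant in this setup.  For reference, a sketch of
both (paths are placeholders):

# only meaningful if hbase.rootdir actually lived on HDFS
hadoop fsck / -files -blocks

# what was run on this node: checks filesystem metadata, not file contents
fsck -f /dev/sdb1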


Re: HBase ChecksumException IllegalArgumentException

Posted by Bai Shen <ba...@gmail.com>.
It's the one from the Cloudera repo: 0.92.1.


Re: HBase ChecksumException IllegalArgumentException

Posted by Ted Yu <yu...@gmail.com>.
Can you tell us which HBase version you are using?
