Posted to user@hbase.apache.org by Stanislav Orlenko <or...@gmail.com> on 2012/12/16 01:13:40 UTC

checksum exception

Hello,
We are running HBase 0.90.6 and are hitting the following problem:

org.apache.hadoop.hbase.client.RetriesExhaustedException: Trying to contact region server marv.site:56463 for region webpage,pl.allegro:http/stare-rozdzielacze-i2839591826.html,1355478823557.80e5ed6af22f12d45eabae89537f8602., row 'pl.allegro:http/stare-rozdzielacze-i2839591826.html', but failed after 10 attempts.
Exceptions:
java.io.IOException: java.io.IOException: Could not iterate StoreFileScanner[HFileScanner for reader reader=file:/home/stas/pricex/hbase/webpage/80e5ed6af22f12d45eabae89537f8602/f/5763919690181742358, compression=none, inMemory=false, firstKey=pl.allegro:http/stare-rozdzielacze-i2839591826.html/f:bas/1355311168483/Put, lastKey=pl.gumtree.lodz:http/p-TermsAndConditions/f:ts/1355360972551/Put, avgKeyLen=83, avgValueLen=2880, entries=64207, length=191013462, cur=pl.allegro:http/swiat-mebli-komoda-srebrna-61980-k14-i2787252255.html/f:cnt/1355480654644/Put/vlen=62169]
        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:89)
        at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:82)
        at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:262)
        at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2469)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2425)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2442)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1863)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
Caused by: org.apache.hadoop.fs.ChecksumException: Checksum error: file:/home/stas/pricex/hbase/webpage/80e5ed6af22f12d45eabae89537f8602/f/5763919690181742358 at 28099584
        at org.apache.hadoop.fs.FSInputChecker.verifySum(FSInputChecker.java:277)
        at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:241)
        at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:189)
        at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:158)
        at java.io.DataInputStream.read(DataInputStream.java:132)
        at org.apache.hadoop.hbase.io.hfile.BoundedRangeFileInputStream.read(BoundedRangeFileInputStream.java:105)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
        at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:102)
        at org.apache.hadoop.hbase.io.hfile.HFile$Reader.decompress(HFile.java:1094)
        at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readBlock(HFile.java:1036)
        at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1276)
        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:87)
        ... 12 more

java.io.IOException: java.io.IOException: java.lang.IllegalArgumentException
        at org.apache.hadoop.hbase.regionserver.HRegionServer.convertThrowableToIOE(HRegionServer.java:997)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.convertThrowableToIOE(HRegionServer.java:986)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1887)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)
Caused by: java.lang.IllegalArgumentException
        at java.nio.Buffer.position(Buffer.java:218)
        at org.apache.hadoop.hbase.io.hfile.HFile$Reader$Scanner.next(HFile.java:1266)
        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:87)
        at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:82)
        at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:262)
        at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:114)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2469)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2425)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2442)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1863)
        ... 5 more

We do not use any special configuration; only hbase.rootdir and
hbase.zookeeper.property.dataDir are set in hbase-site.xml (shown below).
What could be the reason for this behavior?
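
Essentially the whole file is the following (the ZooKeeper dataDir value
here is a placeholder, and the rootdir is the local file: path that also
appears in the trace above):

<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>file:///home/stas/pricex/hbase</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.dataDir</name>
    <!-- placeholder path, not the exact value from this machine -->
    <value>/home/stas/zookeeper</value>
  </property>
</configuration>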

Thanks in advance

Re: checksum exception

Posted by Andrew Purtell <ap...@apache.org>.
> file:/home/stas/pricex/hbase/webpage/80e5ed6af22f12d45eabae89537f8602/f/5763919690181742358

Are you running HBase on HDFS?

I've seen checksum errors like this before when using Hadoop's LocalFS
(file:///foo/bar/). There isn't anything HBase can do about it; the problem
is in the (old) Hadoop libraries.
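
If you do move the data onto HDFS, the main change on the HBase side is
hbase.rootdir, roughly something like this in hbase-site.xml (the NameNode
host and port below are placeholders for whatever your cluster uses):

<property>
  <name>hbase.rootdir</name>
  <value>hdfs://namenode.example.com:9000/hbase</value>
</property>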

Also, the hadoop-core-0.20-append-r1056497 version of Hadoop was rather
specific to Facebook and a select few other deployments at a certain point
in HBase history, which is pretty far back now. I doubt anyone in Hadoop is
willing or able to support it today. Likewise, HBase 0.90.x is two releases
behind our current release. You'll find the community is much better able to
support you if you are running something more up to date. These may be the
versions that ship with the release of Nutch+Gora you are using, but maybe
there is a more recent option?


-- 
Best regards,

   - Andy

Problems worthy of attack prove their worth by hitting back. - Piet Hein
(via Tom White)

Re: checksum exception

Posted by Stanislav Orlenko <or...@gmail.com>.
In the logs I also see exceptions like this:

2012-12-19 01:40:45,320 ERROR org.apache.hadoop.hbase.regionserver.HRegionServer:
java.lang.IllegalArgumentException: offset (65577) + length (2) exceed the capacity of the array: 65577
        at org.apache.hadoop.hbase.util.Bytes.explainWrongLengthOrOffset(Bytes.java:506)
        at org.apache.hadoop.hbase.util.Bytes.toShort(Bytes.java:728)
        at org.apache.hadoop.hbase.util.Bytes.toShort(Bytes.java:714)
        at org.apache.hadoop.hbase.KeyValue.getRowLength(KeyValue.java:733)
        at org.apache.hadoop.hbase.KeyValue.getRow(KeyValue.java:894)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.peekRow(HRegion.java:2523)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.nextInternal(HRegion.java:2454)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2425)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScanner.next(HRegion.java:2442)
        at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1863)
        at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:570)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1039)

I am not sure about the Hadoop version; in HBase's lib directory I see
hadoop-core-0.20-append-r1056497.jar.

We use HBase as a backend for Gora in Nutch 2. Gora 0.2 uses HBase 0.90.x,
which is why we use HBase 0.90.6. By the way, in Nutch's lib directory I see
hadoop-core-1.0.3.jar.
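
Since there are two different hadoop-core jars around, one way to see which
build actually gets loaded is a tiny check run with the region server's
classpath (just a sketch, the class name is mine):

import org.apache.hadoop.util.VersionInfo;

public class PrintHadoopVersion {
    public static void main(String[] args) {
        // VersionInfo reports the Hadoop build found on the classpath
        System.out.println("Hadoop version: " + VersionInfo.getVersion());
        System.out.println("Revision:       " + VersionInfo.getRevision());
        System.out.println("Build details:  " + VersionInfo.getBuildVersion());
    }
}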

One strange thing: after a reboot it works without the exception for some
time, but after a while the checksum exception appears again. For example,
after the previous reboot it worked for about 24 hours, and now I see this
exception again.



Re: checksum exception

Posted by Ted Yu <yu...@gmail.com>.
The exception came from FSInputChecker.verifySum().

Can you check the NameNode log around the time the error happened to find
more information?
What Hadoop version do you use?

BTW 0.90.6 is really old. I suggest upgrading to 0.92 or 0.94.

Cheers
