Posted to user@hbase.apache.org by غلامرضا <g....@chmail.ir> on 2015/06/27 12:44:46 UTC
Could not reseek StoreFileScanner
Hi,
I got this exception in a reduce task, when the task tries to increment a table.
Jun 27 12:42:56 10.3.72.94 [INFO]-2015/06/27 12:42:56-AsyncProcess.logAndResubmit(713) - #10486, table=table1, attempt=10/35 failed 4 ops, last exception: java.io.IOException: java.io.IOException: Could not reseek StoreFileScanner[HFileScanner for reader reader=hdfs://m2/hbase2/data/default/table1/d52beedee15de2e7bb380f14bb0929fb/c2/daa0269a1f1c44f3811a25976b9278c8_SeqId_95_, compression=snappy, cacheConf=CacheConfig:enabled [cacheDataOnRead=true] [cacheDataOnWrite=false] [cacheIndexesOnWrite=false] [cacheBloomsOnWrite=false] [cacheEvictOnClose=false] [cacheDataCompressed=false] [prefetchOnOpen=false], firstKey=\x00KEY1\x013yQ/c2:\x03\x00\x03^D\xA9\xC4/1435136203460/Put, lastKey=\x00KEYN\x013yS/c2:\x03\x00\x02\xAE~A\xE0/1435136896864/Put, avgKeyLen=36, avgValueLen=68, entries=15350817, length=466678923, cur=\x00KEY2\x013yT/c2:/OLDEST_TIMESTAMP/Minimum/vlen=0/mvcc=0] to key \x00KEY3\x013yT/c2:\x00fhamrah/LATEST_TIMESTAMP/Maximum/vlen=0/mvcc=0
at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:184)
at org.apache.hadoop.hbase.regionserver.NonLazyKeyValueScanner.doRealSeek(NonLazyKeyValueScanner.java:55)
at org.apache.hadoop.hbase.regionserver.KeyValueHeap.generalizedSeek(KeyValueHeap.java:313)
at org.apache.hadoop.hbase.regionserver.KeyValueHeap.requestSeek(KeyValueHeap.java:269)
at org.apache.hadoop.hbase.regionserver.StoreScanner.reseek(StoreScanner.java:741)
at org.apache.hadoop.hbase.regionserver.StoreScanner.seekAsDirection(StoreScanner.java:729)
at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:546)
at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:140)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.populateResult(HRegion.java:4103)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:4183)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:4061)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:4030)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:4017)
at org.apache.hadoop.hbase.regionserver.HRegion.get(HRegion.java:5010)
at org.apache.hadoop.hbase.regionserver.HRegion.increment(HRegion.java:5611)
at org.apache.hadoop.hbase.regionserver.HRegionServer.increment(HRegionServer.java:4452)
at org.apache.hadoop.hbase.regionserver.HRegionServer.doNonAtomicRegionMutation(HRegionServer.java:3673)
at org.apache.hadoop.hbase.regionserver.HRegionServer.multi(HRegionServer.java:3607)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:30954)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2093)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Failed to read compressed block at 302546975, onDiskSizeWithoutHeader=67898, preReadHeaderSize=33, header.length=33, header bytes: DATABLK*\x00\x00D\x16\x00\x01\x00Q\x00\x00\x00\x00\x01}\x1D\x98\x01\x00\x00@\x00\x00\x00D/
at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockDataInternal(HFileBlock.java:1549)
at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockData(HFileBlock.java:1413)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:394)
at org.apache.hadoop.hbase.io.hfile.HFileBlockIndex$BlockIndexReader.loadDataBlockWithScanInfo(HFileBlockIndex.java:253)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekTo(HFileReaderV2.java:539)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.reseekTo(HFileReaderV2.java:587)
at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseekAtOrAfter(StoreFileScanner.java:257)
at org.apache.hadoop.hbase.regionserver.StoreFileScanner.reseek(StoreFileScanner.java:173)
... 23 more
Caused by: java.io.IOException: Invalid HFile block magic: \x00\x00\x00\x00\x00\x00\x00\x00
at org.apache.hadoop.hbase.io.hfile.BlockType.parse(BlockType.java:154)
at org.apache.hadoop.hbase.io.hfile.BlockType.read(BlockType.java:165)
at org.apache.hadoop.hbase.io.hfile.HFileBlock.<init>(HFileBlock.java:252)
at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockDataInternal(HFileBlock.java:1546)
... 30 more
on n129,60020,1434448472071, tracking started Sat Jun 27 12:42:28 IRDT 2015, retrying after 10056 ms, replay 4 ops.
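The innermost cause is the block-magic check: every HFile block begins with an 8-byte magic (data blocks use DATABLK*), and the all-zero magic in the last "Caused by" means the region server read zeros where a block header should be. An illustrative re-creation of that check in Python (a sketch, not the actual HBase source; the function name is mine):

```python
# Illustrative re-creation of the check that throws in BlockType.parse:
# an HFile data block must start with the 8-byte magic b"DATABLK*".
# An all-zero magic (as in the log above) means the bytes read back
# were zeros instead of a block header.
DATA_BLOCK_MAGIC = b"DATABLK*"

def parse_block_magic(buf: bytes) -> str:
    """Return the block type for a buffer, or raise like HBase does."""
    magic = bytes(buf[:8])
    if magic != DATA_BLOCK_MAGIC:
        raise IOError("Invalid HFile block magic: "
                      + "".join("\\x%02X" % b for b in magic))
    return "DATA"

print(parse_block_magic(b"DATABLK*\x00\x00D\x16"))  # a valid header parses
```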
But when I check the store file with "hbase hfile", it looks fine:
hbase hfile -v -f /hbase2/data/default/table1/d52beedee15de2e7bb380f14bb0929fb/c2/daa0269a1f1c44f3811a25976b9278c8_SeqId_95_
Scanning -> /hbase2/data/default/table1/d52beedee15de2e7bb380f14bb0929fb/c2/daa0269a1f1c44f3811a25976b9278c8_SeqId_95_
2015-06-27 14:02:39,241 INFO [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
2015-06-27 14:02:39,392 WARN [main] snappy.LoadSnappy: Snappy native library is available
2015-06-27 14:02:39,394 INFO [main] util.NativeCodeLoader: Loaded the native-hadoop library
2015-06-27 14:02:39,394 INFO [main] snappy.LoadSnappy: Snappy native library loaded
2015-06-27 14:02:39,397 INFO [main] compress.CodecPool: Got brand-new decompressor
Scanned kv count -> 15350817
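The numbers in the exception also pin down exactly which bytes of the file failed to read; assuming the HFile v2 layout where the on-disk block is the 33-byte header plus onDiskSizeWithoutHeader, the arithmetic is (illustrative sketch):

```python
# Byte range of the failing block, using the numbers from the exception.
# Assumes HFile v2 layout: on-disk block = 33-byte header + body.
block_offset = 302546975            # "Failed to read compressed block at ..."
header_size = 33                    # preReadHeaderSize / header.length
body_size = 67898                   # onDiskSizeWithoutHeader

block_end = block_offset + header_size + body_size
print(block_offset, block_end)      # the byte range to inspect on disk

# With the Hadoop 1.x default 64 MB HDFS block size, this offset falls
# in this zero-based HDFS block of the file:
print(block_offset // (64 * 1024 * 1024))
```

Knowing the offending HDFS block index makes it easier to match the failure against DataNode logs or fsck output.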
Re: Could not reseek StoreFileScanner
Posted by غلامرضا <g....@chmail.ir>.
JDK 1.7
Hadoop 1.2.1
HBase 0.98.12.1
Linux Ubuntu 12.04.1 amd64

I checked the DataNode/NameNode logs and did not find anything related. This error occurs after every load.
------------------------------ message ------------------------------
from: Vladimir Rodionov <vl...@gmail.com>
to: user@hbase.apache.org
subject: Re: Could not reseek StoreFileScanner
Check your NameNode/DataNode log files. There will be some additional bits
of information there.
-Vlad
On Sat, Jun 27, 2015 at 6:14 AM, Ted Yu wrote:
> Please provide a bit more information:
>
> the hbase / hadoop release you use
> the type of data block encoding for the table
>
> How often did this happen ?
>
> thanks
>
>
> On Sat, Jun 27, 2015 at 3:44 AM, غلامرضا wrote:
> > [original message quoted in full; trimmed here, see the first post above]
Re: Could not reseek StoreFileScanner
Posted by Vladimir Rodionov <vl...@gmail.com>.
Check your NameNode/DataNode log files. There will be some additional bits
of information there.
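Since the region server and the hfile tool may have read different replicas, it can also help to cross-check the store file at the HDFS layer; a sketch using the Hadoop 1.x fsck syntax (the path is the one from the stack trace):

```shell
# Hadoop 1.x fsck: verify the blocks backing the store file from the log.
# -files -blocks -locations lists each HDFS block and the DataNodes holding it,
# which narrows down which replica/DataNode to look at.
hadoop fsck /hbase2/data/default/table1/d52beedee15de2e7bb380f14bb0929fb/c2/daa0269a1f1c44f3811a25976b9278c8_SeqId_95_ \
  -files -blocks -locations
```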
-Vlad
On Sat, Jun 27, 2015 at 6:14 AM, Ted Yu <yu...@gmail.com> wrote:
> Please provide a bit more information:
>
> the hbase / hadoop release you use
> the type of data block encoding for the table
>
> How often did this happen ?
>
> thanks
>
>
> On Sat, Jun 27, 2015 at 3:44 AM, غلامرضا <g....@chmail.ir> wrote:
> > [original message quoted in full; trimmed here, see the first post above]
Re: Could not reseek StoreFileScanner
Posted by Ted Yu <yu...@gmail.com>.
Please provide a bit more information:
the hbase / hadoop release you use
the type of data block encoding for the table
How often did this happen ?
thanks
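For reference, the data block encoding asked about here can be read from the column family descriptor; a sketch using the hbase shell (table name taken from the thread):

```shell
# The hbase shell describe output includes DATA_BLOCK_ENCODING in the
# descriptor of each column family (c2 in this thread).
echo "describe 'table1'" | hbase shell
```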
On Sat, Jun 27, 2015 at 3:44 AM, غلامرضا <g....@chmail.ir> wrote:
> [original message quoted in full; trimmed here, see the first post above]