Posted to user@hbase.apache.org by Weizhan Zeng <qg...@gmail.com> on 2017/11/25 14:21:50 UTC

Re: hbase.bucketcache.bucket.sizes had set multiple of 1024 but still got "Invalid HFile block magic"

Sorry, I made a calculation mistake. Just ignore it ...
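
(For anyone who hits the same error: HBASE-16993 requires every value in hbase.bucketcache.bucket.sizes to be a multiple of 1024. A quick sketch below checks the list from the quoted config against that rule; the odd one out is 198211, which is the calculation mistake referred to above.)

```python
# Check which configured bucket sizes violate the multiple-of-1024
# alignment rule (sizes copied from the hbase-site.xml snippet quoted below).
sizes = [6144, 9216, 41984, 50176, 58368, 66560, 99328, 132096,
         198211, 263168, 394240, 525312, 1049600, 2099200]

bad = [s for s in sizes if s % 1024 != 0]
print(bad)            # -> [198211]
print(198211 % 1024)  # -> 579, so 198211 is not 1024-aligned
```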

2017-11-25 20:52 GMT+08:00 Weizhan Zeng <qg...@gmail.com>:

> Hi guys,
>    In https://issues.apache.org/jira/browse/HBASE-16993 , I found that
>
> hbase.bucketcache.bucket.sizes must be set to multiples of 1024. But when I set
>
>   <property>
>     <name>hbase.bucketcache.bucket.sizes</name>
>     <value>6144,9216,41984,50176,58368,66560,99328,132096,198211,263168,394240,525312,1049600,2099200</value>
>   </property>
>
> I still get this error:
>
>
> 2017-11-25 20:37:37,222 ERROR [B.defaultRpcServer.handler=20,queue=1,port=60020] bucket.BucketCache: Failed reading block d444ab4b244140c199f23a3870f59136_250591965 from bucket cache
> java.io.IOException: Invalid HFile block magic:
> \x00\x00\x00\x00\x00\x00\x00\x00
> at org.apache.hadoop.hbase.io.hfile.BlockType.parse(BlockType.java:155)
> at org.apache.hadoop.hbase.io.hfile.BlockType.read(BlockType.java:167)
> at org.apache.hadoop.hbase.io.hfile.HFileBlock.<init>(HFileBlock.java:275)
> at org.apache.hadoop.hbase.io.hfile.HFileBlock$1.deserialize(HFileBlock.java:136)
> at org.apache.hadoop.hbase.io.hfile.HFileBlock$1.deserialize(HFileBlock.java:123)
> at org.apache.hadoop.hbase.io.hfile.bucket.BucketCache.getBlock(BucketCache.java:428)
> at org.apache.hadoop.hbase.io.hfile.CombinedBlockCache.getBlock(CombinedBlockCache.java:85)
> at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.getCachedBlock(HFileReaderV2.java:278)
> at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:418)
> at org.apache.hadoop.hbase.io.hfile.HFileBlockIndex$BlockIndexReader.loadDataBlockWithScanInfo(HFileBlockIndex.java:271)
> at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekTo(HFileReaderV2.java:649)
> at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekTo(HFileReaderV2.java:599)
> at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seekAtOrAfter(StoreFileScanner.java:268)
> at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seek(StoreFileScanner.java:173)
> at org.apache.hadoop.hbase.regionserver.StoreScanner.seekScanners(StoreScanner.java:350)
> at org.apache.hadoop.hbase.regionserver.StoreScanner.<init>(StoreScanner.java:199)
> at org.apache.hadoop.hbase.regionserver.HStore.getScanner(HStore.java:2077)
> at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.<init>(HRegion.java:5556)
> at org.apache.hadoop.hbase.regionserver.HRegion.instantiateRegionScanner(HRegion.java:2574)
> at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2560)
> at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2541)
> at org.apache.hadoop.hbase.regionserver.HRegion.get(HRegion.java:6830)
> at org.apache.hadoop.hbase.regionserver.HRegion.get(HRegion.java:6809)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2049)
> at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33644)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2196)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
> at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
> at java.lang.Thread.run(Thread.java:748)
>
>
> Is there anything I missed ?
>