Posted to common-user@hadoop.apache.org by Billy <sa...@pearsonwholesale.com> on 2007/12/12 10:42:43 UTC

EOFException in hbase

Any idea if this can be fixed without losing data, or why it would happen?


The master log keeps trying to assign the regions, but the region server 
keeps failing with the error below.

2007-12-12 03:38:19,843 INFO org.apache.hadoop.hbase.HRegionServer: 
MSG_REGION_OPEN : regionname: webdata,,1197386306044, startKey: <>, 
tableDesc: {name: webdata, families: {in_links:={name: in_links, max 
versions: 1, compression: BLOCK, in memory: false, max length: 2147483647, 
bloom filter: none}, out_links:={name: out_links, max versions: 1, 
compression: BLOCK, in memory: false, max length: 2147483647, bloom filter: 
none}, rank_total:={name: rank_total, max versions: 1, compression: BLOCK, 
in memory: false, max length: 2147483647, bloom filter: none}, stime:={name: 
stime, max versions: 1, compression: BLOCK, in memory: false, max length: 
2147483647, bloom filter: none}}}
2007-12-12 03:38:20,243 ERROR org.apache.hadoop.hbase.HRegionServer: unable 
to process message: MSG_REGION_OPEN : regionname: webdata,,1197386306044, 
startKey: <>, tableDesc: {name: webdata, families: {in_links:={name: 
in_links, max versions: 1, compression: BLOCK, in memory: false, max length: 
2147483647, bloom filter: none}, out_links:={name: out_links, max versions: 
1, compression: BLOCK, in memory: false, max length: 2147483647, bloom 
filter: none}, rank_total:={name: rank_total, max versions: 1, compression: 
BLOCK, in memory: false, max length: 2147483647, bloom filter: none}, 
stime:={name: stime, max versions: 1, compression: BLOCK, in memory: false, 
max length: 2147483647, bloom filter: none}}}
java.io.EOFException
        at java.io.DataInputStream.readFully(DataInputStream.java:178)
        at java.io.DataInputStream.readFully(DataInputStream.java:152)
        at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1383)
        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1360)
        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1349)
        at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1344)
        at org.apache.hadoop.hbase.HStore.doReconstructionLog(HStore.java:678)
        at org.apache.hadoop.hbase.HStore.<init>(HStore.java:613)
        at org.apache.hadoop.hbase.HRegion.<init>(HRegion.java:287)
        at org.apache.hadoop.hbase.HRegionServer.openRegion(HRegionServer.java:1158)
        at org.apache.hadoop.hbase.HRegionServer$Worker.run(HRegionServer.java:1110)
        at java.lang.Thread.run(Thread.java:595)
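[Editor's note: the top frames of the trace show SequenceFile$Reader.init 
calling DataInputStream.readFully on the file header, so the EOF means the 
reconstruction log it is opening is shorter than the header it expects, 
typically a zero-length or truncated file. A minimal, stdlib-only sketch of 
that failure mode (not Hadoop code; the 4-byte "SEQ"-plus-version header is 
the SequenceFile magic, and the helper name is ours):]

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;

public class TruncatedHeaderDemo {

    // Mimics the first thing SequenceFile.Reader.init does: readFully on the
    // file header. A zero-length or truncated file cannot satisfy readFully,
    // so it throws EOFException -- the same top frames as the trace above.
    static boolean throwsEof(byte[] fileBytes) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(fileBytes));
        byte[] header = new byte[4];   // SequenceFiles begin with "SEQ" plus a version byte
        try {
            in.readFully(header);      // wants 4 bytes; fewer available -> EOFException
            return false;
        } catch (EOFException e) {
            return true;
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println("empty file      -> EOF? " + throwsEof(new byte[0]));
        System.out.println("complete header -> EOF? " + throwsEof(new byte[] {'S', 'E', 'Q', 6}));
    }
}
```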




Re: EOFException in hbase

Posted by stack <st...@duboce.net>.
Did you update your hbase and use data written by a previous version of 
hbase?  If so, I'd guess the EOF is because of the incompatible changes 
listed in CHANGES.txt.  Otherwise, is there anything earlier in the log 
complaining of failed writes to HDFS?
St.Ack


Billy wrote:
> Any idea if this can be fixed without losing data, or why it would happen?
>
> [original log output and stack trace snipped]
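[Editor's note: if the cause turns out to be a write that died before 
flushing rather than the version incompatibility St.Ack mentions, the 
offending reconstruction log is often simply zero-length. A hedged sketch 
for spotting such files; it uses plain java.io.File to stay self-contained, 
whereas on a real cluster the equivalent check would go through Hadoop's 
FileSystem API against HDFS. Whether to remove such a log, which discards 
any edits it held, is a separate judgment call:]

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class FindEmptyLogs {

    // Lists zero-length files in a directory -- candidates for truncated
    // write-ahead/reconstruction logs. Local-filesystem only; on HDFS you
    // would list the directory through org.apache.hadoop.fs.FileSystem.
    static List<File> zeroLengthFiles(File dir) {
        List<File> empties = new ArrayList<File>();
        File[] files = dir.listFiles();
        if (files != null) {
            for (File f : files) {
                if (f.isFile() && f.length() == 0) {
                    empties.add(f);
                }
            }
        }
        return empties;
    }

    public static void main(String[] args) {
        File dir = new File(args.length > 0 ? args[0] : ".");
        for (File f : zeroLengthFiles(dir)) {
            System.out.println("zero-length (possibly truncated) file: " + f.getPath());
        }
    }
}
```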