Posted to dev@hbase.apache.org by Lars George <la...@gmail.com> on 2011/07/05 11:31:11 UTC
Exception in TableInfo check
Hi,
I consistently manage to run into this error:
==> /var/lib/hbase/logs/hbase-larsgeorge-master-de1-app-mbp-2.log <==
2011-07-05 11:26:00,758 WARN org.apache.hadoop.hbase.master.HMaster: Failed getting all descriptors
java.io.FileNotFoundException: No status for hdfs://localhost:8020/hbase/.corrupt
        at org.apache.hadoop.hbase.util.FSUtils.getTableInfoModtime(FSUtils.java:888)
        at org.apache.hadoop.hbase.util.FSTableDescriptors.get(FSTableDescriptors.java:122)
        at org.apache.hadoop.hbase.util.FSTableDescriptors.getAll(FSTableDescriptors.java:149)
        at org.apache.hadoop.hbase.master.HMaster.getHTableDescriptors(HMaster.java:1429)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:312)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1065)
Why is the TableInfo check failing on .corrupt? The directory does exist:
drwxr-xr-x   - larsgeorge supergroup          0 2011-07-05 09:29 /hbase/.corrupt
Is this directory not skipped, and instead presumed to be a table directory?
Lars
PS: Just as a note, I am posting these things here because I currently have no time to investigate and fix them (if necessary). They seem trivial, but I just cannot deviate from my current assignment. Sorry for being a PITA. :(
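[Editorial aside: the failure mode can be reproduced in miniature. If every directory under the root is presumed to be a table and a per-table metadata file is then stat'ed, any non-table directory such as .corrupt triggers exactly this FileNotFoundException. A standalone sketch on the local filesystem — the `.tableinfo` file name and layout here are illustrative, not HBase's actual on-disk format:]

```java
import java.io.File;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.nio.file.Files;

public class TableInfoDemo {
    // Mimics the shape of FSUtils.getTableInfoModtime: stat the
    // table's metadata file, fail if it is not there.
    static long getTableInfoModtime(File tableDir) throws FileNotFoundException {
        File info = new File(tableDir, ".tableinfo");
        if (!info.exists()) {
            throw new FileNotFoundException("No status for " + info.getPath());
        }
        return info.lastModified();
    }

    public static void main(String[] args) throws IOException {
        File root = Files.createTempDirectory("hbase-demo").toFile();
        File table = new File(root, "usertable");
        table.mkdir();
        new File(table, ".tableinfo").createNewFile();
        new File(root, ".corrupt").mkdir(); // no .tableinfo inside

        // Presuming every directory is a table blows up on .corrupt:
        for (File dir : root.listFiles(File::isDirectory)) {
            try {
                getTableInfoModtime(dir);
                System.out.println(dir.getName() + ": ok");
            } catch (FileNotFoundException e) {
                System.out.println(dir.getName() + ": " + e.getMessage());
            }
        }
    }
}
```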
Re: Exception in TableInfo check
Posted by Lars George <la...@gmail.com>.
Created HBASE-4061
On Tue, Jul 5, 2011 at 2:38 AM, Lars George <la...@gmail.com> wrote:
> Ah,
>
> public static List<Path> getTableDirs(final FileSystem fs, final Path rootdir)
> throws IOException {
>   // presumes any directory under hbase.rootdir is a table
>   FileStatus[] dirs = fs.listStatus(rootdir, new DirFilter(fs));
>   List<Path> tabledirs = new ArrayList<Path>(dirs.length);
>   for (FileStatus dir : dirs) {
>     Path p = dir.getPath();
>     String tableName = p.getName();
>     if (tableName.equals(HConstants.HREGION_LOGDIR_NAME) ||
>         tableName.equals(Bytes.toString(HConstants.ROOT_TABLE_NAME)) ||
>         tableName.equals(Bytes.toString(HConstants.META_TABLE_NAME)) ||
>         tableName.equals(HConstants.HREGION_OLDLOGDIR_NAME)) {
>       continue;
>     }
>     tabledirs.add(p);
>   }
>   return tabledirs;
> }
>
> This is missing .tmp and .corrupt, and splitlogs and...
>
> Lars
>
Re: Exception in TableInfo check
Posted by Lars George <la...@gmail.com>.
Ah,
public static List<Path> getTableDirs(final FileSystem fs, final Path rootdir)
throws IOException {
  // presumes any directory under hbase.rootdir is a table
  FileStatus[] dirs = fs.listStatus(rootdir, new DirFilter(fs));
  List<Path> tabledirs = new ArrayList<Path>(dirs.length);
  for (FileStatus dir : dirs) {
    Path p = dir.getPath();
    String tableName = p.getName();
    if (tableName.equals(HConstants.HREGION_LOGDIR_NAME) ||
        tableName.equals(Bytes.toString(HConstants.ROOT_TABLE_NAME)) ||
        tableName.equals(Bytes.toString(HConstants.META_TABLE_NAME)) ||
        tableName.equals(HConstants.HREGION_OLDLOGDIR_NAME)) {
      continue;
    }
    tabledirs.add(p);
  }
  return tabledirs;
}
This is missing .tmp and .corrupt, and splitlogs and...
Lars
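[Editorial aside: the gap Lars points out — getTableDirs skipping only the log, -ROOT-, .META., and old-log directories — could be closed by collecting every known non-table directory name in one set. A minimal standalone sketch, not the actual HBASE-4061 patch; the literal names `.corrupt`, `.tmp`, and `splitlog` are taken from the thread, and the helper below is illustrative rather than HBase API:]

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class TableDirFilter {
    // Hypothetical set of non-table directories under hbase.rootdir.
    // ".logs"/".oldlogs"/"-ROOT-"/".META." mirror the existing checks;
    // ".corrupt", ".tmp", and "splitlog" are the ones the thread says
    // are missing.
    static final Set<String> NON_TABLE_DIRS = new HashSet<String>(Arrays.asList(
        ".logs", ".oldlogs", "-ROOT-", ".META.", ".corrupt", ".tmp", "splitlog"));

    // Keep only names that look like table directories, instead of
    // enumerating every exclusion in a chain of equals() calls.
    static List<String> filterTableDirs(List<String> dirNames) {
        List<String> tabledirs = new ArrayList<String>();
        for (String name : dirNames) {
            if (NON_TABLE_DIRS.contains(name)) {
                continue; // known infrastructure directory, not a table
            }
            tabledirs.add(name);
        }
        return tabledirs;
    }

    public static void main(String[] args) {
        List<String> dirs = Arrays.asList(".corrupt", ".tmp", "usertable", "-ROOT-");
        System.out.println(filterTableDirs(dirs)); // prints [usertable]
    }
}
```

One advantage of the set-based form is that adding a future non-table directory is a one-line change, rather than another `||` branch that is easy to forget.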