Posted to dev@hbase.apache.org by Ted Yu <yu...@gmail.com> on 2010/07/03 16:23:20 UTC

compiling HBaseFsck.java for 0.20.5

Hi,
I tried to compile HBaseFsck.java for 0.20.5 but got:

compile-core:
    [javac] Compiling 338 source files to
/Users/tyu/hbase-0.20.5/build/classes
    [javac]
/Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:95:
cannot find symbol
    [javac] symbol  : constructor
HBaseAdmin(org.apache.hadoop.conf.Configuration)
    [javac] location: class org.apache.hadoop.hbase.client.HBaseAdmin
    [javac]     super(conf);
    [javac]     ^
    [javac]
/Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:447:
cannot find symbol
    [javac] symbol  : method
metaScan(org.apache.hadoop.conf.Configuration,org.apache.hadoop.hbase.client.MetaScanner.MetaScannerVisitor)
    [javac] location: class org.apache.hadoop.hbase.client.MetaScanner
    [javac]       MetaScanner.metaScan(conf, visitor);
    [javac]                  ^
    [javac]
/Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:503:
cannot find symbol
    [javac] symbol  : method create()
    [javac] location: class org.apache.hadoop.hbase.HBaseConfiguration
    [javac]     Configuration conf = HBaseConfiguration.create();
    [javac]                                            ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 3 errors

Advice is welcome.
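
For reference, all three failures are calls that do not exist in the 0.20.5 client API, as the errors show. A rough sketch of the kind of substitutions a backport would need is below; the exact 0.20.5 signatures are my assumption (HBaseConfiguration used directly where the newer code uses Configuration), so please verify them against the 0.20.5 source first:

    // Hypothetical backport sketch, not checked against 0.20.5.
    // Assumes the 0.20.x methods take HBaseConfiguration rather than Configuration.
    HBaseConfiguration conf = new HBaseConfiguration();  // instead of HBaseConfiguration.create()
    HBaseAdmin admin = new HBaseAdmin(conf);             // 0.20.x-style constructor
    MetaScanner.metaScan(conf, visitor);                 // likewise pass an HBaseConfiguration

Inside HBaseFsck's own constructor, the super(conf) call would then need an HBaseConfiguration as well, e.g. super(new HBaseConfiguration(conf)), assuming that copy constructor is present in 0.20.5.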

Re: compiling HBaseFsck.java for 0.20.5

Posted by Ted Yu <yu...@gmail.com>.
Although an entry is added to the catalog table, I don't see it on
:60010/master.jsp

10/07/06 09:52:16 DEBUG client.HConnectionManager$TableServers: Found ROOT
at 10.32.56.159:60020
10/07/06 09:52:16 DEBUG client.HConnectionManager$TableServers: Cached
location for .META.,,1 is 10.32.56.160:60020
10/07/06 09:52:16 DEBUG client.HTable$ClientScanner: Creating scanner over
.META. starting at key 'TRIAL-ERRORS-1277252980233-0,,'
10/07/06 09:52:16 DEBUG client.HTable$ClientScanner: Advancing internal
scanner to startKey at 'TRIAL-ERRORS-1277252980233-0,,'
10/07/06 09:52:16 DEBUG client.HTable$ClientScanner: Finished with scanning
at REGION => {NAME => '.META.,,1', STARTKEY => '', ENDKEY => '', ENCODED =>
1028785192, TABLE => {{NAME => '.META.', IS_META => 'true',
MEMSTORE_FLUSHSIZE => '16384', FAMILIES => [{NAME => 'historian', VERSIONS
=> '2147483647', COMPRESSION => 'NONE', TTL => '604800', BLOCKSIZE =>
'8192', IN_MEMORY => 'false', BLOCKCACHE => 'false'}, {NAME => 'info',
VERSIONS => '10', COMPRESSION => 'NONE', TTL => '2147483647', BLOCKSIZE =>
'8192', IN_MEMORY => 'false', BLOCKCACHE => 'false'}]}}
10/07/06 09:52:16 INFO add_table: Walking hdfs://
sjc9-flash-grid04.ciq.com:9000/hbase/TRIAL-ERRORS-1277252980233-0 adding
regions to catalog table
10/07/06 09:52:16 INFO add_table: Added to catalog: REGION => {NAME =>
'TRIAL-ERRORS-1277252980233-0,,1277252990116', STARTKEY => '', ENDKEY => '',
ENCODED => 1682051688, TABLE => {{NAME => 'TRIAL-ERRORS-1277252980233-0',
FAMILIES => [{NAME => 'd', COMPRESSION => 'GZ', VERSIONS => '1', TTL =>
'31536000', BLOCKSIZE => '65536', IN_MEMORY => 'false', BLOCKCACHE =>
'true'}, {NAME => 'i', COMPRESSION => 'GZ', VERSIONS => '1', TTL =>
'31536000', BLOCKSIZE => '65536', IN_MEMORY => 'false', BLOCKCACHE =>
'true'}, {NAME => 'v', COMPRESSION => 'GZ', VERSIONS => '1', TTL =>
'31536000', BLOCKSIZE => '65536', IN_MEMORY => 'false', BLOCKCACHE =>
'true'}]}}
10/07/06 09:52:16 INFO zookeeper.ZooKeeper: Closing session:
0x1299926deb3000a
10/07/06 09:52:16 INFO zookeeper.ClientCnxn: Closing ClientCnxn for session:
0x1299926deb3000a
10/07/06 09:52:16 INFO zookeeper.ClientCnxn: Exception while closing send
thread for session 0x1299926deb3000a : Read error rc = -1
java.nio.DirectByteBuffer[pos=0 lim=4 cap=4]
10/07/06 09:52:16 INFO zookeeper.ClientCnxn: Disconnecting ClientCnxn for
session: 0x1299926deb3000a
10/07/06 09:52:16 INFO zookeeper.ClientCnxn: EventThread shut down
10/07/06 09:52:16 INFO zookeeper.ZooKeeper: Session: 0x1299926deb3000a
closed
10/07/06 09:52:16 DEBUG zookeeper.ZooKeeperWrapper: Closed connection with
ZooKeeper
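
A generic way to double-check the catalog entry independently of master.jsp (plain hbase shell usage, nothing specific to this thread) is to scan the catalog table directly and look for the TRIAL-ERRORS-1277252980233-0 rows:

    ./bin/hbase shell
    scan '.META.'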


On Tue, Jul 6, 2010 at 9:54 AM, Jean-Daniel Cryans <jd...@apache.org>wrote:

> bin/add_table.rb
>
> J-D
>
> On Tue, Jul 6, 2010 at 9:44 AM, Ted Yu <yu...@gmail.com> wrote:
>
> > That fixes the issue.
> >
> > HBaseFsck found missing tables after scanning hdfs (possibly from
> previous
> > release of HBase - I installed 0.20.5 recently):
> > ERROR: Path hdfs://
> > sjc9-flash-grid04.ciq.com:9000/hbase/TRIAL-ERRORS-1277252980233-0 does
> not
> > have a corresponding entry in META.
> >
> > Is there a way to add those tables back ?
> >
> > Thanks
> >
> > On Tue, Jul 6, 2010 at 9:08 AM, Stack <st...@duboce.net> wrote:
> >
> > > HBaseFsck does this:
> > >
> > >    conf.set("fs.defaultFS", conf.get("hbase.rootdir"));
> > >
> > > Add this line:
> > >
> > >    conf.set("fs.default.name", conf.get("hbase.rootdir"));
> > >
> > > See if that fixes it (The former is new way of spec'ing defaultFS
> > > while latter is oldstyle).
> > >
> > > St.Ack
> > >
> > > On Mon, Jul 5, 2010 at 6:25 PM, Ted Yu <yu...@gmail.com> wrote:
> > > > I assume the conf directory is that of HBase.
> > > >
> > > > I use this command previously:
> > > > bin/hbase hbck
> > > >
> > > > I tried this today:
> > > > bin/hbase org.apache.hadoop.hbase.client.HBaseFsck
> > > >
> > > > Result is the same.
> > > >
> > > > I do see conf in the classpath:
> > > > 10/07/05 18:12:32 INFO zookeeper.ZooKeeper: Client
> > > > environment:java.class.path=/home/hadoop/mmp/234_x/hbase/conf:...
> > > > ...
> > > > rootDir: hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase hdfs://
> > > > sjc9-flash-grid04.carrieriq.com:9000/hbase
> > > > Version: 0.20.5
> > > > 10/07/05 18:12:32 DEBUG zookeeper.ZooKeeperWrapper: Read ZNode
> > > > /hbase/root-region-server got 10.32.56.159:60020
> > > > 10/07/05 18:12:32 DEBUG client.HConnectionManager$TableServers: Found
> > > ROOT
> > > > at 10.32.56.159:60020
> > > > 10/07/05 18:12:32 DEBUG client.HConnectionManager$TableServers:
> Cached
> > > > location for .META.,,1 is 10.32.56.159:60020
> > > >
> > > > Number of Tables: 0
> > > > Number of live region servers:2
> > > > Number of dead region servers:0
> > > > Exception in thread "main" java.lang.IllegalArgumentException: Wrong
> > FS:
> > > > hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase, expected:
> file:///
> > > >        at
> > org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
> > > >
> > > >
> > > > On Mon, Jul 5, 2010 at 10:24 AM, Stack <st...@duboce.net> wrote:
> > > >
> > > >> Make sure conf directory is in your classpath.  If it is, it might
> the
> > > >> case that you need something like the below:
> > > >>
> > > >> # Set hadoop filesystem configuration using the hbase.rootdir.
> > > >> # Otherwise, we'll always use localhost though the hbase.rootdir
> > > >> # might be pointing at hdfs location.
> > > >> c.set("fs.default.name", c.get(HConstants::HBASE_DIR))
> > > >> fs = FileSystem.get(c)
> > > >>
> > > >> The above is copied from the jruby scripts in the bin dir......
> > > >>
> > > >> ...though looking at the HBaseFsck it does this.
> > > >>
> > > >> So it must be a case of your not setting up the classpath properly?
> > > >>
> > > >> You've set the target hdfs in your hbase-site.xml and then you've
> > > >> launched the script as per:
> > > >>
> > > >> ./bin/hbase org.apache.hadoop.hbase.client.HBaseFsck
> > > >>
> > > >> (The above will ensure your classpath is set properly).
> > > >>
> > > >> St.Ack
> > > >>
> > > >>
> > > >>
> > > >> On Sat, Jul 3, 2010 at 9:51 AM, Ted Yu <yu...@gmail.com> wrote:
> > > >> > I produced patched version of HBaseFsck.java which is attached.
> > > >> >
> > > >> > When I ran it, I got:
> > > >> >
> > > >> > Version: 0.20.5
> > > >> > 10/07/03 09:41:38 DEBUG zookeeper.ZooKeeperWrapper: Read ZNode
> > > >> > /hbase/root-region-server got 10.32.56.159:60020
> > > >> > 10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers:
> > Found
> > > >> ROOT
> > > >> > at 10.32.56.159:60020
> > > >> > 10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers:
> > Cached
> > > >> > location for .META.,,1 is 10.32.56.160:60020
> > > >> >
> > > >> > Number of Tables: 0
> > > >> > Number of live region servers:2
> > > >> > Number of dead region servers:0
> > > >> > Exception in thread "main" java.lang.IllegalArgumentException:
> Wrong
> > > FS:
> > > >> > hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase, expected:
> > file:///
> > > >> >         at
> > > org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
> > > >> >         at
> > > >> >
> > > >>
> > >
> >
> org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:47)
> > > >> >         at
> > > >> >
> > > >>
> > >
> >
> org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:273)
> > > >> >         at
> > > >> org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:721)
> > > >> >         at
> > > >> org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:746)
> > > >> >         at
> > > >> >
> > > >>
> > >
> >
> org.apache.hadoop.fs.ChecksumFileSystem.listStatus(ChecksumFileSystem.java:465)
> > > >> >         at
> > > >> >
> > org.apache.hadoop.hbase.client.HBaseFsck.checkHdfs(HBaseFsck.java:192)
> > > >> >         at
> > > >> >
> org.apache.hadoop.hbase.client.HBaseFsck.doWork(HBaseFsck.java:165)
> > > >> >         at
> > > >> org.apache.hadoop.hbase.client.HBaseFsck.main(HBaseFsck.java:533)
> > > >> > 10/07/03 09:41:38 INFO zookeeper.ZooKeeper: Closing session:
> > > >> > 0x1299926deb30004
> > > >> >
> > > >> > Please comment.
> > > >> >
> > > >> > On Sat, Jul 3, 2010 at 7:23 AM, Ted Yu <yu...@gmail.com>
> wrote:
> > > >> >>
> > > >> >> Hi,
> > > >> >> I tried to compile HBaseFsck.java for 0.20.5 but got:
> > > >> >>
> > > >> >> compile-core:
> > > >> >>     [javac] Compiling 338 source files to
> > > >> >> /Users/tyu/hbase-0.20.5/build/classes
> > > >> >>     [javac]
> > > >> >>
> > > >>
> > >
> >
> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:95:
> > > >> >> cannot find symbol
> > > >> >>     [javac] symbol  : constructor
> > > >> >> HBaseAdmin(org.apache.hadoop.conf.Configuration)
> > > >> >>     [javac] location: class
> > org.apache.hadoop.hbase.client.HBaseAdmin
> > > >> >>     [javac]     super(conf);
> > > >> >>     [javac]     ^
> > > >> >>     [javac]
> > > >> >>
> > > >>
> > >
> >
> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:447:
> > > >> >> cannot find symbol
> > > >> >>     [javac] symbol  : method
> > > >> >>
> > > >>
> > >
> >
> metaScan(org.apache.hadoop.conf.Configuration,org.apache.hadoop.hbase.client.MetaScanner.MetaScannerVisitor)
> > > >> >>     [javac] location: class
> > > org.apache.hadoop.hbase.client.MetaScanner
> > > >> >>     [javac]       MetaScanner.metaScan(conf, visitor);
> > > >> >>     [javac]                  ^
> > > >> >>     [javac]
> > > >> >>
> > > >>
> > >
> >
> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:503:
> > > >> >> cannot find symbol
> > > >> >>     [javac] symbol  : method create()
> > > >> >>     [javac] location: class
> > > org.apache.hadoop.hbase.HBaseConfiguration
> > > >> >>     [javac]     Configuration conf = HBaseConfiguration.create();
> > > >> >>     [javac]                                            ^
> > > >> >>     [javac] Note: Some input files use or override a deprecated
> > API.
> > > >> >>     [javac] Note: Recompile with -Xlint:deprecation for details.
> > > >> >>     [javac] Note: Some input files use unchecked or unsafe
> > > operations.
> > > >> >>     [javac] Note: Recompile with -Xlint:unchecked for details.
> > > >> >>     [javac] 3 errors
> > > >> >>
> > > >> >> Advice is welcome.
> > > >> >
> > > >> >
> > > >>
> > > >
> > >
> >
>

Re: compiling HBaseFsck.java for 0.20.5

Posted by Jean-Daniel Cryans <jd...@apache.org>.
bin/add_table.rb

J-D
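
If it helps, the jruby scripts under bin/ are, as far as I know, launched the same way as the HBaseFsck command elsewhere in this thread, with the HDFS path of the orphaned table directory as the argument; roughly:

    ./bin/hbase org.jruby.Main bin/add_table.rb hdfs://sjc9-flash-grid04.ciq.com:9000/hbase/TRIAL-ERRORS-1277252980233-0

(the argument handling is the script's own, so check the usage notes at the top of bin/add_table.rb first).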

On Tue, Jul 6, 2010 at 9:44 AM, Ted Yu <yu...@gmail.com> wrote:

> That fixes the issue.
>
> HBaseFsck found missing tables after scanning hdfs (possibly from previous
> release of HBase - I installed 0.20.5 recently):
> ERROR: Path hdfs://
> sjc9-flash-grid04.ciq.com:9000/hbase/TRIAL-ERRORS-1277252980233-0 does not
> have a corresponding entry in META.
>
> Is there a way to add those tables back ?
>
> Thanks
>
> On Tue, Jul 6, 2010 at 9:08 AM, Stack <st...@duboce.net> wrote:
>
> > HBaseFsck does this:
> >
> >    conf.set("fs.defaultFS", conf.get("hbase.rootdir"));
> >
> > Add this line:
> >
> >    conf.set("fs.default.name", conf.get("hbase.rootdir"));
> >
> > See if that fixes it (The former is new way of spec'ing defaultFS
> > while latter is oldstyle).
> >
> > St.Ack
> >
> > On Mon, Jul 5, 2010 at 6:25 PM, Ted Yu <yu...@gmail.com> wrote:
> > > I assume the conf directory is that of HBase.
> > >
> > > I use this command previously:
> > > bin/hbase hbck
> > >
> > > I tried this today:
> > > bin/hbase org.apache.hadoop.hbase.client.HBaseFsck
> > >
> > > Result is the same.
> > >
> > > I do see conf in the classpath:
> > > 10/07/05 18:12:32 INFO zookeeper.ZooKeeper: Client
> > > environment:java.class.path=/home/hadoop/mmp/234_x/hbase/conf:...
> > > ...
> > > rootDir: hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase hdfs://
> > > sjc9-flash-grid04.carrieriq.com:9000/hbase
> > > Version: 0.20.5
> > > 10/07/05 18:12:32 DEBUG zookeeper.ZooKeeperWrapper: Read ZNode
> > > /hbase/root-region-server got 10.32.56.159:60020
> > > 10/07/05 18:12:32 DEBUG client.HConnectionManager$TableServers: Found
> > ROOT
> > > at 10.32.56.159:60020
> > > 10/07/05 18:12:32 DEBUG client.HConnectionManager$TableServers: Cached
> > > location for .META.,,1 is 10.32.56.159:60020
> > >
> > > Number of Tables: 0
> > > Number of live region servers:2
> > > Number of dead region servers:0
> > > Exception in thread "main" java.lang.IllegalArgumentException: Wrong
> FS:
> > > hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase, expected: file:///
> > >        at
> org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
> > >
> > >
> > > On Mon, Jul 5, 2010 at 10:24 AM, Stack <st...@duboce.net> wrote:
> > >
> > >> Make sure conf directory is in your classpath.  If it is, it might the
> > >> case that you need something like the below:
> > >>
> > >> # Set hadoop filesystem configuration using the hbase.rootdir.
> > >> # Otherwise, we'll always use localhost though the hbase.rootdir
> > >> # might be pointing at hdfs location.
> > >> c.set("fs.default.name", c.get(HConstants::HBASE_DIR))
> > >> fs = FileSystem.get(c)
> > >>
> > >> The above is copied from the jruby scripts in the bin dir......
> > >>
> > >> ...though looking at the HBaseFsck it does this.
> > >>
> > >> So it must be a case of your not setting up the classpath properly?
> > >>
> > >> You've set the target hdfs in your hbase-site.xml and then you've
> > >> launched the script as per:
> > >>
> > >> ./bin/hbase org.apache.hadoop.hbase.client.HBaseFsck
> > >>
> > >> (The above will ensure your classpath is set properly).
> > >>
> > >> St.Ack
> > >>
> > >>
> > >>
> > >> On Sat, Jul 3, 2010 at 9:51 AM, Ted Yu <yu...@gmail.com> wrote:
> > >> > I produced patched version of HBaseFsck.java which is attached.
> > >> >
> > >> > When I ran it, I got:
> > >> >
> > >> > Version: 0.20.5
> > >> > 10/07/03 09:41:38 DEBUG zookeeper.ZooKeeperWrapper: Read ZNode
> > >> > /hbase/root-region-server got 10.32.56.159:60020
> > >> > 10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers:
> Found
> > >> ROOT
> > >> > at 10.32.56.159:60020
> > >> > 10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers:
> Cached
> > >> > location for .META.,,1 is 10.32.56.160:60020
> > >> >
> > >> > Number of Tables: 0
> > >> > Number of live region servers:2
> > >> > Number of dead region servers:0
> > >> > Exception in thread "main" java.lang.IllegalArgumentException: Wrong
> > FS:
> > >> > hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase, expected:
> file:///
> > >> >         at
> > org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
> > >> >         at
> > >> >
> > >>
> >
> org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:47)
> > >> >         at
> > >> >
> > >>
> >
> org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:273)
> > >> >         at
> > >> org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:721)
> > >> >         at
> > >> org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:746)
> > >> >         at
> > >> >
> > >>
> >
> org.apache.hadoop.fs.ChecksumFileSystem.listStatus(ChecksumFileSystem.java:465)
> > >> >         at
> > >> >
> org.apache.hadoop.hbase.client.HBaseFsck.checkHdfs(HBaseFsck.java:192)
> > >> >         at
> > >> > org.apache.hadoop.hbase.client.HBaseFsck.doWork(HBaseFsck.java:165)
> > >> >         at
> > >> org.apache.hadoop.hbase.client.HBaseFsck.main(HBaseFsck.java:533)
> > >> > 10/07/03 09:41:38 INFO zookeeper.ZooKeeper: Closing session:
> > >> > 0x1299926deb30004
> > >> >
> > >> > Please comment.
> > >> >
> > >> > On Sat, Jul 3, 2010 at 7:23 AM, Ted Yu <yu...@gmail.com> wrote:
> > >> >>
> > >> >> Hi,
> > >> >> I tried to compile HBaseFsck.java for 0.20.5 but got:
> > >> >>
> > >> >> compile-core:
> > >> >>     [javac] Compiling 338 source files to
> > >> >> /Users/tyu/hbase-0.20.5/build/classes
> > >> >>     [javac]
> > >> >>
> > >>
> >
> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:95:
> > >> >> cannot find symbol
> > >> >>     [javac] symbol  : constructor
> > >> >> HBaseAdmin(org.apache.hadoop.conf.Configuration)
> > >> >>     [javac] location: class
> org.apache.hadoop.hbase.client.HBaseAdmin
> > >> >>     [javac]     super(conf);
> > >> >>     [javac]     ^
> > >> >>     [javac]
> > >> >>
> > >>
> >
> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:447:
> > >> >> cannot find symbol
> > >> >>     [javac] symbol  : method
> > >> >>
> > >>
> >
> metaScan(org.apache.hadoop.conf.Configuration,org.apache.hadoop.hbase.client.MetaScanner.MetaScannerVisitor)
> > >> >>     [javac] location: class
> > org.apache.hadoop.hbase.client.MetaScanner
> > >> >>     [javac]       MetaScanner.metaScan(conf, visitor);
> > >> >>     [javac]                  ^
> > >> >>     [javac]
> > >> >>
> > >>
> >
> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:503:
> > >> >> cannot find symbol
> > >> >>     [javac] symbol  : method create()
> > >> >>     [javac] location: class
> > org.apache.hadoop.hbase.HBaseConfiguration
> > >> >>     [javac]     Configuration conf = HBaseConfiguration.create();
> > >> >>     [javac]                                            ^
> > >> >>     [javac] Note: Some input files use or override a deprecated
> API.
> > >> >>     [javac] Note: Recompile with -Xlint:deprecation for details.
> > >> >>     [javac] Note: Some input files use unchecked or unsafe
> > operations.
> > >> >>     [javac] Note: Recompile with -Xlint:unchecked for details.
> > >> >>     [javac] 3 errors
> > >> >>
> > >> >> Advice is welcome.
> > >> >
> > >> >
> > >>
> > >
> >
>

Re: compiling HBaseFsck.java for 0.20.5

Posted by Ted Yu <yu...@gmail.com>.
That fixes the issue.

HBaseFsck found missing tables after scanning hdfs (possibly from a previous
release of HBase - I installed 0.20.5 recently):
ERROR: Path hdfs://
sjc9-flash-grid04.ciq.com:9000/hbase/TRIAL-ERRORS-1277252980233-0 does not
have a corresponding entry in META.

Is there a way to add those tables back?

Thanks

On Tue, Jul 6, 2010 at 9:08 AM, Stack <st...@duboce.net> wrote:

> HBaseFsck does this:
>
>    conf.set("fs.defaultFS", conf.get("hbase.rootdir"));
>
> Add this line:
>
>    conf.set("fs.default.name", conf.get("hbase.rootdir"));
>
> See if that fixes it (The former is new way of spec'ing defaultFS
> while latter is oldstyle).
>
> St.Ack
>
> On Mon, Jul 5, 2010 at 6:25 PM, Ted Yu <yu...@gmail.com> wrote:
> > I assume the conf directory is that of HBase.
> >
> > I use this command previously:
> > bin/hbase hbck
> >
> > I tried this today:
> > bin/hbase org.apache.hadoop.hbase.client.HBaseFsck
> >
> > Result is the same.
> >
> > I do see conf in the classpath:
> > 10/07/05 18:12:32 INFO zookeeper.ZooKeeper: Client
> > environment:java.class.path=/home/hadoop/mmp/234_x/hbase/conf:...
> > ...
> > rootDir: hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase hdfs://
> > sjc9-flash-grid04.carrieriq.com:9000/hbase
> > Version: 0.20.5
> > 10/07/05 18:12:32 DEBUG zookeeper.ZooKeeperWrapper: Read ZNode
> > /hbase/root-region-server got 10.32.56.159:60020
> > 10/07/05 18:12:32 DEBUG client.HConnectionManager$TableServers: Found
> ROOT
> > at 10.32.56.159:60020
> > 10/07/05 18:12:32 DEBUG client.HConnectionManager$TableServers: Cached
> > location for .META.,,1 is 10.32.56.159:60020
> >
> > Number of Tables: 0
> > Number of live region servers:2
> > Number of dead region servers:0
> > Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS:
> > hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase, expected: file:///
> >        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
> >
> >
> > On Mon, Jul 5, 2010 at 10:24 AM, Stack <st...@duboce.net> wrote:
> >
> >> Make sure conf directory is in your classpath.  If it is, it might the
> >> case that you need something like the below:
> >>
> >> # Set hadoop filesystem configuration using the hbase.rootdir.
> >> # Otherwise, we'll always use localhost though the hbase.rootdir
> >> # might be pointing at hdfs location.
> >> c.set("fs.default.name", c.get(HConstants::HBASE_DIR))
> >> fs = FileSystem.get(c)
> >>
> >> The above is copied from the jruby scripts in the bin dir......
> >>
> >> ...though looking at the HBaseFsck it does this.
> >>
> >> So it must be a case of your not setting up the classpath properly?
> >>
> >> You've set the target hdfs in your hbase-site.xml and then you've
> >> launched the script as per:
> >>
> >> ./bin/hbase org.apache.hadoop.hbase.client.HBaseFsck
> >>
> >> (The above will ensure your classpath is set properly).
> >>
> >> St.Ack
> >>
> >>
> >>
> >> On Sat, Jul 3, 2010 at 9:51 AM, Ted Yu <yu...@gmail.com> wrote:
> >> > I produced patched version of HBaseFsck.java which is attached.
> >> >
> >> > When I ran it, I got:
> >> >
> >> > Version: 0.20.5
> >> > 10/07/03 09:41:38 DEBUG zookeeper.ZooKeeperWrapper: Read ZNode
> >> > /hbase/root-region-server got 10.32.56.159:60020
> >> > 10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers: Found
> >> ROOT
> >> > at 10.32.56.159:60020
> >> > 10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers: Cached
> >> > location for .META.,,1 is 10.32.56.160:60020
> >> >
> >> > Number of Tables: 0
> >> > Number of live region servers:2
> >> > Number of dead region servers:0
> >> > Exception in thread "main" java.lang.IllegalArgumentException: Wrong
> FS:
> >> > hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase, expected: file:///
> >> >         at
> org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
> >> >         at
> >> >
> >>
> org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:47)
> >> >         at
> >> >
> >>
> org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:273)
> >> >         at
> >> org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:721)
> >> >         at
> >> org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:746)
> >> >         at
> >> >
> >>
> org.apache.hadoop.fs.ChecksumFileSystem.listStatus(ChecksumFileSystem.java:465)
> >> >         at
> >> > org.apache.hadoop.hbase.client.HBaseFsck.checkHdfs(HBaseFsck.java:192)
> >> >         at
> >> > org.apache.hadoop.hbase.client.HBaseFsck.doWork(HBaseFsck.java:165)
> >> >         at
> >> org.apache.hadoop.hbase.client.HBaseFsck.main(HBaseFsck.java:533)
> >> > 10/07/03 09:41:38 INFO zookeeper.ZooKeeper: Closing session:
> >> > 0x1299926deb30004
> >> >
> >> > Please comment.
> >> >
> >> > On Sat, Jul 3, 2010 at 7:23 AM, Ted Yu <yu...@gmail.com> wrote:
> >> >>
> >> >> Hi,
> >> >> I tried to compile HBaseFsck.java for 0.20.5 but got:
> >> >>
> >> >> compile-core:
> >> >>     [javac] Compiling 338 source files to
> >> >> /Users/tyu/hbase-0.20.5/build/classes
> >> >>     [javac]
> >> >>
> >>
> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:95:
> >> >> cannot find symbol
> >> >>     [javac] symbol  : constructor
> >> >> HBaseAdmin(org.apache.hadoop.conf.Configuration)
> >> >>     [javac] location: class org.apache.hadoop.hbase.client.HBaseAdmin
> >> >>     [javac]     super(conf);
> >> >>     [javac]     ^
> >> >>     [javac]
> >> >>
> >>
> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:447:
> >> >> cannot find symbol
> >> >>     [javac] symbol  : method
> >> >>
> >>
> metaScan(org.apache.hadoop.conf.Configuration,org.apache.hadoop.hbase.client.MetaScanner.MetaScannerVisitor)
> >> >>     [javac] location: class
> org.apache.hadoop.hbase.client.MetaScanner
> >> >>     [javac]       MetaScanner.metaScan(conf, visitor);
> >> >>     [javac]                  ^
> >> >>     [javac]
> >> >>
> >>
> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:503:
> >> >> cannot find symbol
> >> >>     [javac] symbol  : method create()
> >> >>     [javac] location: class
> org.apache.hadoop.hbase.HBaseConfiguration
> >> >>     [javac]     Configuration conf = HBaseConfiguration.create();
> >> >>     [javac]                                            ^
> >> >>     [javac] Note: Some input files use or override a deprecated API.
> >> >>     [javac] Note: Recompile with -Xlint:deprecation for details.
> >> >>     [javac] Note: Some input files use unchecked or unsafe
> operations.
> >> >>     [javac] Note: Recompile with -Xlint:unchecked for details.
> >> >>     [javac] 3 errors
> >> >>
> >> >> Advice is welcome.
> >> >
> >> >
> >>
> >
>

Re: compiling HBaseFsck.java for 0.20.5

Posted by Stack <st...@duboce.net>.
HBaseFsck does this:

    conf.set("fs.defaultFS", conf.get("hbase.rootdir"));

Add this line:

    conf.set("fs.default.name", conf.get("hbase.rootdir"));

See if that fixes it (the former is the new way of spec'ing defaultFS,
while the latter is the old style).

St.Ack
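
For clarity, a minimal sketch of the patched spot with both keys set before the filesystem is resolved (variable names are illustrative, not the exact HBaseFsck code):

    // Set both the new-style and old-style default-filesystem keys from hbase.rootdir
    // so that FileSystem.get() resolves HDFS instead of the local file:/// filesystem.
    Configuration conf = new HBaseConfiguration();
    conf.set("fs.defaultFS", conf.get("hbase.rootdir"));     // new-style key
    conf.set("fs.default.name", conf.get("hbase.rootdir"));  // old-style key that Hadoop 0.20 reads
    FileSystem fs = FileSystem.get(conf);                    // should now be an HDFS filesystem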

On Mon, Jul 5, 2010 at 6:25 PM, Ted Yu <yu...@gmail.com> wrote:
> I assume the conf directory is that of HBase.
>
> I use this command previously:
> bin/hbase hbck
>
> I tried this today:
> bin/hbase org.apache.hadoop.hbase.client.HBaseFsck
>
> Result is the same.
>
> I do see conf in the classpath:
> 10/07/05 18:12:32 INFO zookeeper.ZooKeeper: Client
> environment:java.class.path=/home/hadoop/mmp/234_x/hbase/conf:...
> ...
> rootDir: hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase hdfs://
> sjc9-flash-grid04.carrieriq.com:9000/hbase
> Version: 0.20.5
> 10/07/05 18:12:32 DEBUG zookeeper.ZooKeeperWrapper: Read ZNode
> /hbase/root-region-server got 10.32.56.159:60020
> 10/07/05 18:12:32 DEBUG client.HConnectionManager$TableServers: Found ROOT
> at 10.32.56.159:60020
> 10/07/05 18:12:32 DEBUG client.HConnectionManager$TableServers: Cached
> location for .META.,,1 is 10.32.56.159:60020
>
> Number of Tables: 0
> Number of live region servers:2
> Number of dead region servers:0
> Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS:
> hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase, expected: file:///
>        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
>
>
> On Mon, Jul 5, 2010 at 10:24 AM, Stack <st...@duboce.net> wrote:
>
>> Make sure conf directory is in your classpath.  If it is, it might the
>> case that you need something like the below:
>>
>> # Set hadoop filesystem configuration using the hbase.rootdir.
>> # Otherwise, we'll always use localhost though the hbase.rootdir
>> # might be pointing at hdfs location.
>> c.set("fs.default.name", c.get(HConstants::HBASE_DIR))
>> fs = FileSystem.get(c)
>>
>> The above is copied from the jruby scripts in the bin dir......
>>
>> ...though looking at the HBaseFsck it does this.
>>
>> So it must be a case of your not setting up the classpath properly?
>>
>> You've set the target hdfs in your hbase-site.xml and then you've
>> launched the script as per:
>>
>> ./bin/hbase org.apache.hadoop.hbase.client.HBaseFsck
>>
>> (The above will ensure your classpath is set properly).
>>
>> St.Ack
>>
>>
>>
>> On Sat, Jul 3, 2010 at 9:51 AM, Ted Yu <yu...@gmail.com> wrote:
>> > I produced patched version of HBaseFsck.java which is attached.
>> >
>> > When I ran it, I got:
>> >
>> > Version: 0.20.5
>> > 10/07/03 09:41:38 DEBUG zookeeper.ZooKeeperWrapper: Read ZNode
>> > /hbase/root-region-server got 10.32.56.159:60020
>> > 10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers: Found
>> ROOT
>> > at 10.32.56.159:60020
>> > 10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers: Cached
>> > location for .META.,,1 is 10.32.56.160:60020
>> >
>> > Number of Tables: 0
>> > Number of live region servers:2
>> > Number of dead region servers:0
>> > Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS:
>> > hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase, expected: file:///
>> >         at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
>> >         at
>> >
>> org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:47)
>> >         at
>> >
>> org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:273)
>> >         at
>> org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:721)
>> >         at
>> org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:746)
>> >         at
>> >
>> org.apache.hadoop.fs.ChecksumFileSystem.listStatus(ChecksumFileSystem.java:465)
>> >         at
>> > org.apache.hadoop.hbase.client.HBaseFsck.checkHdfs(HBaseFsck.java:192)
>> >         at
>> > org.apache.hadoop.hbase.client.HBaseFsck.doWork(HBaseFsck.java:165)
>> >         at
>> org.apache.hadoop.hbase.client.HBaseFsck.main(HBaseFsck.java:533)
>> > 10/07/03 09:41:38 INFO zookeeper.ZooKeeper: Closing session:
>> > 0x1299926deb30004
>> >
>> > Please comment.
>> >
>> > On Sat, Jul 3, 2010 at 7:23 AM, Ted Yu <yu...@gmail.com> wrote:
>> >>
>> >> Hi,
>> >> I tried to compile HBaseFsck.java for 0.20.5 but got:
>> >>
>> >> compile-core:
>> >>     [javac] Compiling 338 source files to
>> >> /Users/tyu/hbase-0.20.5/build/classes
>> >>     [javac]
>> >>
>> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:95:
>> >> cannot find symbol
>> >>     [javac] symbol  : constructor
>> >> HBaseAdmin(org.apache.hadoop.conf.Configuration)
>> >>     [javac] location: class org.apache.hadoop.hbase.client.HBaseAdmin
>> >>     [javac]     super(conf);
>> >>     [javac]     ^
>> >>     [javac]
>> >>
>> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:447:
>> >> cannot find symbol
>> >>     [javac] symbol  : method
>> >>
>> metaScan(org.apache.hadoop.conf.Configuration,org.apache.hadoop.hbase.client.MetaScanner.MetaScannerVisitor)
>> >>     [javac] location: class org.apache.hadoop.hbase.client.MetaScanner
>> >>     [javac]       MetaScanner.metaScan(conf, visitor);
>> >>     [javac]                  ^
>> >>     [javac]
>> >>
>> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:503:
>> >> cannot find symbol
>> >>     [javac] symbol  : method create()
>> >>     [javac] location: class org.apache.hadoop.hbase.HBaseConfiguration
>> >>     [javac]     Configuration conf = HBaseConfiguration.create();
>> >>     [javac]                                            ^
>> >>     [javac] Note: Some input files use or override a deprecated API.
>> >>     [javac] Note: Recompile with -Xlint:deprecation for details.
>> >>     [javac] Note: Some input files use unchecked or unsafe operations.
>> >>     [javac] Note: Recompile with -Xlint:unchecked for details.
>> >>     [javac] 3 errors
>> >>
>> >> Advice is welcome.
>> >
>> >
>>
>

Re: compiling HBaseFsck.java for 0.20.5

Posted by Ted Yu <yu...@gmail.com>.
I assume the conf directory is that of HBase.

I used this command previously:
bin/hbase hbck

I tried this today:
bin/hbase org.apache.hadoop.hbase.client.HBaseFsck

Result is the same.

I do see conf in the classpath:
10/07/05 18:12:32 INFO zookeeper.ZooKeeper: Client
environment:java.class.path=/home/hadoop/mmp/234_x/hbase/conf:...
...
rootDir: hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase hdfs://
sjc9-flash-grid04.carrieriq.com:9000/hbase
Version: 0.20.5
10/07/05 18:12:32 DEBUG zookeeper.ZooKeeperWrapper: Read ZNode
/hbase/root-region-server got 10.32.56.159:60020
10/07/05 18:12:32 DEBUG client.HConnectionManager$TableServers: Found ROOT
at 10.32.56.159:60020
10/07/05 18:12:32 DEBUG client.HConnectionManager$TableServers: Cached
location for .META.,,1 is 10.32.56.159:60020

Number of Tables: 0
Number of live region servers:2
Number of dead region servers:0
Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS:
hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase, expected: file:///
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)


On Mon, Jul 5, 2010 at 10:24 AM, Stack <st...@duboce.net> wrote:

> Make sure conf directory is in your classpath.  If it is, it might the
> case that you need something like the below:
>
> # Set hadoop filesystem configuration using the hbase.rootdir.
> # Otherwise, we'll always use localhost though the hbase.rootdir
> # might be pointing at hdfs location.
> c.set("fs.default.name", c.get(HConstants::HBASE_DIR))
> fs = FileSystem.get(c)
>
> The above is copied from the jruby scripts in the bin dir......
>
> ...though looking at the HBaseFsck it does this.
>
> So it must be a case of your not setting up the classpath properly?
>
> You've set the target hdfs in your hbase-site.xml and then you've
> launched the script as per:
>
> ./bin/hbase org.apache.hadoop.hbase.client.HBaseFsck
>
> (The above will ensure your classpath is set properly).
>
> St.Ack
>
>
>
> On Sat, Jul 3, 2010 at 9:51 AM, Ted Yu <yu...@gmail.com> wrote:
> > I produced patched version of HBaseFsck.java which is attached.
> >
> > When I ran it, I got:
> >
> > Version: 0.20.5
> > 10/07/03 09:41:38 DEBUG zookeeper.ZooKeeperWrapper: Read ZNode
> > /hbase/root-region-server got 10.32.56.159:60020
> > 10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers: Found
> ROOT
> > at 10.32.56.159:60020
> > 10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers: Cached
> > location for .META.,,1 is 10.32.56.160:60020
> >
> > Number of Tables: 0
> > Number of live region servers:2
> > Number of dead region servers:0
> > Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS:
> > hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase, expected: file:///
> >         at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
> >         at
> >
> org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:47)
> >         at
> >
> org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:273)
> >         at
> org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:721)
> >         at
> org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:746)
> >         at
> >
> org.apache.hadoop.fs.ChecksumFileSystem.listStatus(ChecksumFileSystem.java:465)
> >         at
> > org.apache.hadoop.hbase.client.HBaseFsck.checkHdfs(HBaseFsck.java:192)
> >         at
> > org.apache.hadoop.hbase.client.HBaseFsck.doWork(HBaseFsck.java:165)
> >         at
> org.apache.hadoop.hbase.client.HBaseFsck.main(HBaseFsck.java:533)
> > 10/07/03 09:41:38 INFO zookeeper.ZooKeeper: Closing session:
> > 0x1299926deb30004
> >
> > Please comment.
> >
> > On Sat, Jul 3, 2010 at 7:23 AM, Ted Yu <yu...@gmail.com> wrote:
> >>
> >> Hi,
> >> I tried to compile HBaseFsck.java for 0.20.5 but got:
> >>
> >> compile-core:
> >>     [javac] Compiling 338 source files to
> >> /Users/tyu/hbase-0.20.5/build/classes
> >>     [javac]
> >>
> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:95:
> >> cannot find symbol
> >>     [javac] symbol  : constructor
> >> HBaseAdmin(org.apache.hadoop.conf.Configuration)
> >>     [javac] location: class org.apache.hadoop.hbase.client.HBaseAdmin
> >>     [javac]     super(conf);
> >>     [javac]     ^
> >>     [javac]
> >>
> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:447:
> >> cannot find symbol
> >>     [javac] symbol  : method
> >>
> metaScan(org.apache.hadoop.conf.Configuration,org.apache.hadoop.hbase.client.MetaScanner.MetaScannerVisitor)
> >>     [javac] location: class org.apache.hadoop.hbase.client.MetaScanner
> >>     [javac]       MetaScanner.metaScan(conf, visitor);
> >>     [javac]                  ^
> >>     [javac]
> >>
> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:503:
> >> cannot find symbol
> >>     [javac] symbol  : method create()
> >>     [javac] location: class org.apache.hadoop.hbase.HBaseConfiguration
> >>     [javac]     Configuration conf = HBaseConfiguration.create();
> >>     [javac]                                            ^
> >>     [javac] Note: Some input files use or override a deprecated API.
> >>     [javac] Note: Recompile with -Xlint:deprecation for details.
> >>     [javac] Note: Some input files use unchecked or unsafe operations.
> >>     [javac] Note: Recompile with -Xlint:unchecked for details.
> >>     [javac] 3 errors
> >>
> >> Advice is welcome.
> >
> >
>

Re: compiling HBaseFsck.java for 0.20.5

Posted by Stack <st...@duboce.net>.
Make sure the conf directory is in your classpath.  If it is, it might be the
case that you need something like the below:

# Set hadoop filesystem configuration using the hbase.rootdir.
# Otherwise, we'll always use localhost though the hbase.rootdir
# might be pointing at hdfs location.
c.set("fs.default.name", c.get(HConstants::HBASE_DIR))
fs = FileSystem.get(c)

The above is copied from the jruby scripts in the bin dir...

...though looking at HBaseFsck, it does this.

So it must be a case of your not setting up the classpath properly?

You've set the target hdfs in your hbase-site.xml and then you've
launched the script as per:

./bin/hbase org.apache.hadoop.hbase.client.HBaseFsck

(The above will ensure your classpath is set properly).

St.Ack
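
For completeness, the hbase-site.xml setting referred to above is the standard hbase.rootdir property, along the lines of (value taken from the rootDir printed in the error report; adjust to your namenode):

    <property>
      <name>hbase.rootdir</name>
      <value>hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase</value>
    </property>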



On Sat, Jul 3, 2010 at 9:51 AM, Ted Yu <yu...@gmail.com> wrote:
> I produced patched version of HBaseFsck.java which is attached.
>
> When I ran it, I got:
>
> Version: 0.20.5
> 10/07/03 09:41:38 DEBUG zookeeper.ZooKeeperWrapper: Read ZNode
> /hbase/root-region-server got 10.32.56.159:60020
> 10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers: Found ROOT
> at 10.32.56.159:60020
> 10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers: Cached
> location for .META.,,1 is 10.32.56.160:60020
>
> Number of Tables: 0
> Number of live region servers:2
> Number of dead region servers:0
> Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS:
> hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase, expected: file:///
>         at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
>         at
> org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:47)
>         at
> org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:273)
>         at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:721)
>         at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:746)
>         at
> org.apache.hadoop.fs.ChecksumFileSystem.listStatus(ChecksumFileSystem.java:465)
>         at
> org.apache.hadoop.hbase.client.HBaseFsck.checkHdfs(HBaseFsck.java:192)
>         at
> org.apache.hadoop.hbase.client.HBaseFsck.doWork(HBaseFsck.java:165)
>         at org.apache.hadoop.hbase.client.HBaseFsck.main(HBaseFsck.java:533)
> 10/07/03 09:41:38 INFO zookeeper.ZooKeeper: Closing session:
> 0x1299926deb30004
>
> Please comment.
>
> On Sat, Jul 3, 2010 at 7:23 AM, Ted Yu <yu...@gmail.com> wrote:
>>
>> Hi,
>> I tried to compile HBaseFsck.java for 0.20.5 but got:
>>
>> compile-core:
>>     [javac] Compiling 338 source files to
>> /Users/tyu/hbase-0.20.5/build/classes
>>     [javac]
>> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:95:
>> cannot find symbol
>>     [javac] symbol  : constructor
>> HBaseAdmin(org.apache.hadoop.conf.Configuration)
>>     [javac] location: class org.apache.hadoop.hbase.client.HBaseAdmin
>>     [javac]     super(conf);
>>     [javac]     ^
>>     [javac]
>> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:447:
>> cannot find symbol
>>     [javac] symbol  : method
>> metaScan(org.apache.hadoop.conf.Configuration,org.apache.hadoop.hbase.client.MetaScanner.MetaScannerVisitor)
>>     [javac] location: class org.apache.hadoop.hbase.client.MetaScanner
>>     [javac]       MetaScanner.metaScan(conf, visitor);
>>     [javac]                  ^
>>     [javac]
>> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:503:
>> cannot find symbol
>>     [javac] symbol  : method create()
>>     [javac] location: class org.apache.hadoop.hbase.HBaseConfiguration
>>     [javac]     Configuration conf = HBaseConfiguration.create();
>>     [javac]                                            ^
>>     [javac] Note: Some input files use or override a deprecated API.
>>     [javac] Note: Recompile with -Xlint:deprecation for details.
>>     [javac] Note: Some input files use unchecked or unsafe operations.
>>     [javac] Note: Recompile with -Xlint:unchecked for details.
>>     [javac] 3 errors
>>
>> Advice is welcome.
>
>

Re: compiling HBaseFsck.java for 0.20.5

Posted by yu...@gmail.com.
No.

Sent from my Verizon Wireless BlackBerry

-----Original Message-----
From: Dhruba Borthakur <dh...@gmail.com>
Date: Sun, 4 Jul 2010 23:02:54 
To: <de...@hbase.apache.org>
Reply-To: dev@hbase.apache.org
Subject: Re: compiling HBaseFsck.java for 0.20.5

are you, by any chance, running HBase on a filesystem other than HDFS?

thanks
dhruba

On Sat, Jul 3, 2010 at 9:51 AM, Ted Yu <yu...@gmail.com> wrote:

> I produced patched version of HBaseFsck.java which is attached.
>
> When I ran it, I got:
>
> Version: 0.20.5
> 10/07/03 09:41:38 DEBUG zookeeper.ZooKeeperWrapper: Read ZNode
> /hbase/root-region-server got 10.32.56.159:60020
> 10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers: Found ROOT
> at 10.32.56.159:60020
> 10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers: Cached
> location for .META.,,1 is 10.32.56.160:60020
>
> Number of Tables: 0
> Number of live region servers:2
> Number of dead region servers:0
> Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS:
> hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase, expected: file:///
>         at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
>         at
> org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:47)
>         at
> org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:273)
>         at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:721)
>         at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:746)
>         at
> org.apache.hadoop.fs.ChecksumFileSystem.listStatus(ChecksumFileSystem.java:465)
>         at
> org.apache.hadoop.hbase.client.HBaseFsck.checkHdfs(HBaseFsck.java:192)
>         at
> org.apache.hadoop.hbase.client.HBaseFsck.doWork(HBaseFsck.java:165)
>         at
> org.apache.hadoop.hbase.client.HBaseFsck.main(HBaseFsck.java:533)
> 10/07/03 09:41:38 INFO zookeeper.ZooKeeper: Closing session:
> 0x1299926deb30004
>
> Please comment.
>
>
> On Sat, Jul 3, 2010 at 7:23 AM, Ted Yu <yu...@gmail.com> wrote:
>
>> Hi,
>> I tried to compile HBaseFsck.java for 0.20.5 but got:
>>
>> compile-core:
>>     [javac] Compiling 338 source files to
>> /Users/tyu/hbase-0.20.5/build/classes
>>     [javac]
>> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:95:
>> cannot find symbol
>>     [javac] symbol  : constructor
>> HBaseAdmin(org.apache.hadoop.conf.Configuration)
>>     [javac] location: class org.apache.hadoop.hbase.client.HBaseAdmin
>>     [javac]     super(conf);
>>     [javac]     ^
>>     [javac]
>> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:447:
>> cannot find symbol
>>     [javac] symbol  : method
>> metaScan(org.apache.hadoop.conf.Configuration,org.apache.hadoop.hbase.client.MetaScanner.MetaScannerVisitor)
>>     [javac] location: class org.apache.hadoop.hbase.client.MetaScanner
>>     [javac]       MetaScanner.metaScan(conf, visitor);
>>     [javac]                  ^
>>     [javac]
>> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:503:
>> cannot find symbol
>>     [javac] symbol  : method create()
>>     [javac] location: class org.apache.hadoop.hbase.HBaseConfiguration
>>     [javac]     Configuration conf = HBaseConfiguration.create();
>>     [javac]                                            ^
>>     [javac] Note: Some input files use or override a deprecated API.
>>     [javac] Note: Recompile with -Xlint:deprecation for details.
>>     [javac] Note: Some input files use unchecked or unsafe operations.
>>     [javac] Note: Recompile with -Xlint:unchecked for details.
>>     [javac] 3 errors
>>
>> Advice is welcome.
>>
>
>


-- 
Connect to me at http://www.facebook.com/dhruba


Re: compiling HBaseFsck.java for 0.20.5

Posted by Dhruba Borthakur <dh...@gmail.com>.
Are you, by any chance, running HBase on a filesystem other than HDFS?

thanks
dhruba

On Sat, Jul 3, 2010 at 9:51 AM, Ted Yu <yu...@gmail.com> wrote:

> I produced patched version of HBaseFsck.java which is attached.
>
> When I ran it, I got:
>
> Version: 0.20.5
> 10/07/03 09:41:38 DEBUG zookeeper.ZooKeeperWrapper: Read ZNode
> /hbase/root-region-server got 10.32.56.159:60020
> 10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers: Found ROOT
> at 10.32.56.159:60020
> 10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers: Cached
> location for .META.,,1 is 10.32.56.160:60020
>
> Number of Tables: 0
> Number of live region servers:2
> Number of dead region servers:0
> Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS:
> hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase, expected: file:///
>         at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
>         at
> org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:47)
>         at
> org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:273)
>         at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:721)
>         at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:746)
>         at
> org.apache.hadoop.fs.ChecksumFileSystem.listStatus(ChecksumFileSystem.java:465)
>         at
> org.apache.hadoop.hbase.client.HBaseFsck.checkHdfs(HBaseFsck.java:192)
>         at
> org.apache.hadoop.hbase.client.HBaseFsck.doWork(HBaseFsck.java:165)
>         at
> org.apache.hadoop.hbase.client.HBaseFsck.main(HBaseFsck.java:533)
> 10/07/03 09:41:38 INFO zookeeper.ZooKeeper: Closing session:
> 0x1299926deb30004
>
> Please comment.
>
>
> On Sat, Jul 3, 2010 at 7:23 AM, Ted Yu <yu...@gmail.com> wrote:
>
>> Hi,
>> I tried to compile HBaseFsck.java for 0.20.5 but got:
>>
>> compile-core:
>>     [javac] Compiling 338 source files to
>> /Users/tyu/hbase-0.20.5/build/classes
>>     [javac]
>> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:95:
>> cannot find symbol
>>     [javac] symbol  : constructor
>> HBaseAdmin(org.apache.hadoop.conf.Configuration)
>>     [javac] location: class org.apache.hadoop.hbase.client.HBaseAdmin
>>     [javac]     super(conf);
>>     [javac]     ^
>>     [javac]
>> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:447:
>> cannot find symbol
>>     [javac] symbol  : method
>> metaScan(org.apache.hadoop.conf.Configuration,org.apache.hadoop.hbase.client.MetaScanner.MetaScannerVisitor)
>>     [javac] location: class org.apache.hadoop.hbase.client.MetaScanner
>>     [javac]       MetaScanner.metaScan(conf, visitor);
>>     [javac]                  ^
>>     [javac]
>> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:503:
>> cannot find symbol
>>     [javac] symbol  : method create()
>>     [javac] location: class org.apache.hadoop.hbase.HBaseConfiguration
>>     [javac]     Configuration conf = HBaseConfiguration.create();
>>     [javac]                                            ^
>>     [javac] Note: Some input files use or override a deprecated API.
>>     [javac] Note: Recompile with -Xlint:deprecation for details.
>>     [javac] Note: Some input files use unchecked or unsafe operations.
>>     [javac] Note: Recompile with -Xlint:unchecked for details.
>>     [javac] 3 errors
>>
>> Advice is welcome.
>>
>
>


-- 
Connect to me at http://www.facebook.com/dhruba

Re: compiling HBaseFsck.java for 0.20.5

Posted by Ted Yu <yu...@gmail.com>.
I produced a patched version of HBaseFsck.java, which is attached.

When I ran it, I got:

Version: 0.20.5
10/07/03 09:41:38 DEBUG zookeeper.ZooKeeperWrapper: Read ZNode
/hbase/root-region-server got 10.32.56.159:60020
10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers: Found ROOT
at 10.32.56.159:60020
10/07/03 09:41:38 DEBUG client.HConnectionManager$TableServers: Cached
location for .META.,,1 is 10.32.56.160:60020

Number of Tables: 0
Number of live region servers:2
Number of dead region servers:0
Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS:
hdfs://sjc9-flash-grid04.carrieriq.com:9000/hbase, expected: file:///
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
        at
org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:47)
        at
org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:273)
        at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:721)
        at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:746)
        at
org.apache.hadoop.fs.ChecksumFileSystem.listStatus(ChecksumFileSystem.java:465)
        at
org.apache.hadoop.hbase.client.HBaseFsck.checkHdfs(HBaseFsck.java:192)
        at
org.apache.hadoop.hbase.client.HBaseFsck.doWork(HBaseFsck.java:165)
        at org.apache.hadoop.hbase.client.HBaseFsck.main(HBaseFsck.java:533)
10/07/03 09:41:38 INFO zookeeper.ZooKeeper: Closing session:
0x1299926deb30004

Please comment.

On Sat, Jul 3, 2010 at 7:23 AM, Ted Yu <yu...@gmail.com> wrote:

> Hi,
> I tried to compile HBaseFsck.java for 0.20.5 but got:
>
> compile-core:
>     [javac] Compiling 338 source files to
> /Users/tyu/hbase-0.20.5/build/classes
>     [javac]
> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:95:
> cannot find symbol
>     [javac] symbol  : constructor
> HBaseAdmin(org.apache.hadoop.conf.Configuration)
>     [javac] location: class org.apache.hadoop.hbase.client.HBaseAdmin
>     [javac]     super(conf);
>     [javac]     ^
>     [javac]
> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:447:
> cannot find symbol
>     [javac] symbol  : method
> metaScan(org.apache.hadoop.conf.Configuration,org.apache.hadoop.hbase.client.MetaScanner.MetaScannerVisitor)
>     [javac] location: class org.apache.hadoop.hbase.client.MetaScanner
>     [javac]       MetaScanner.metaScan(conf, visitor);
>     [javac]                  ^
>     [javac]
> /Users/tyu/hbase-0.20.5/src/java/org/apache/hadoop/hbase/client/HBaseFsck.java:503:
> cannot find symbol
>     [javac] symbol  : method create()
>     [javac] location: class org.apache.hadoop.hbase.HBaseConfiguration
>     [javac]     Configuration conf = HBaseConfiguration.create();
>     [javac]                                            ^
>     [javac] Note: Some input files use or override a deprecated API.
>     [javac] Note: Recompile with -Xlint:deprecation for details.
>     [javac] Note: Some input files use unchecked or unsafe operations.
>     [javac] Note: Recompile with -Xlint:unchecked for details.
>     [javac] 3 errors
>
> Advice is welcome.
>