Posted to user@hbase.apache.org by Robert Dyer <ps...@gmail.com> on 2013/01/28 02:51:38 UTC
Cleaner spamming log file
I noticed my HBase 0.94.3 master log was growing extremely large, and then saw
the output below repeated over and over. Any ideas what is wrong here? Should I just
remove the /hbase/.archive directory for this table?
(note: I changed sensitive host/table names, and the 'Checking directory'
line appears once per region, so I have omitted most of them here)
2013-01-27 19:44:52,337 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory: hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/efffadf06a01127d476e378b8c9b73ce
2013-01-27 19:44:52,337 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory: hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/f013601d853d9dc0c4d8d951843cae7d
2013-01-27 19:44:52,338 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory: hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/f49e788d91e487ccaa5e772c49f48fd3
2013-01-27 19:44:52,338 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory: hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/f58e3c8b336b692100a139b5d73f702a
2013-01-27 19:44:52,338 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory: hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/f6acbb1629c7646ce2d1181e92d2a0ad
2013-01-27 19:44:52,339 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory: hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/f8b28256870ce1ec011ed679d5348788
2013-01-27 19:44:52,339 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory: hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/fce83dceecc91452e11e1c75b8454145
2013-01-27 19:44:52,339 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory: hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/fed2d047b5fb8684db6faa7ccc2be85d
2013-01-27 19:44:52,340 DEBUG org.apache.hadoop.hbase.master.cleaner.CleanerChore: Checking directory: hdfs://XXXX:8020/hbase/.archive/XXXTABLEXXX/ff811b3e22145d22c2341bcaaadb0e4a
2013-01-27 19:44:52,341 WARN org.apache.hadoop.hbase.master.cleaner.CleanerChore: Error while cleaning the logs
java.io.IOException: java.io.IOException: /hbase/.archive/XXXTABLEXXX is non empty
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:1972)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.delete(NameNode.java:792)
    at sun.reflect.GeneratedMethodAccessor488.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
    at sun.reflect.GeneratedConstructorAccessor15.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
    at org.apache.hadoop.hbase.RemoteExceptionHandler.decodeRemoteException(RemoteExceptionHandler.java:96)
    at org.apache.hadoop.hbase.RemoteExceptionHandler.checkThrowable(RemoteExceptionHandler.java:48)
    at org.apache.hadoop.hbase.RemoteExceptionHandler.checkIOException(RemoteExceptionHandler.java:66)
    at org.apache.hadoop.hbase.master.cleaner.CleanerChore.chore(CleanerChore.java:124)
    at org.apache.hadoop.hbase.Chore.run(Chore.java:67)
    at java.lang.Thread.run(Thread.java:722)
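[Editor's note: the "is non empty" failure mode above is easy to reproduce outside HDFS. The sketch below is plain Python and purely illustrative, not HBase code: like HDFS's non-recursive delete, `os.rmdir` refuses to remove a directory that still contains children, which is the same class of error the CleanerChore keeps hitting and re-logging.]

```python
import os
import tempfile

# Build a tiny stand-in for an archive directory that still holds a file.
root = tempfile.mkdtemp()
region = os.path.join(root, "region")
os.mkdir(region)
open(os.path.join(region, "hfile"), "w").close()

# A non-recursive delete of a non-empty directory fails, just as the
# NameNode's delete() rejects /hbase/.archive/<table> while children remain.
try:
    os.rmdir(region)
    raised = False
except OSError:
    raised = True

print(raised)  # prints True: the delete was refused while children remain
```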
Re: Cleaner spamming log file
Posted by Robert Dyer <rd...@iastate.edu>.
Yes, that does indeed look like my bug. I guess I will upgrade to 0.94.4,
then. Thanks, Jean-Marc and Ted!
On Sun, Jan 27, 2013 at 8:28 PM, Jean-Marc Spaggiari <
jean-marc@spaggiari.org> wrote:
> Indeed, it's related to both ;) You have the exception, which is fixed
> by HBASE-7467, and the size of the logs, which is fixed by HBASE-7214.
>
> Also, I would recommend disabling the DEBUG level for this class
> if you can't migrate to 0.94.4.
>
> JM
>
> 2013/1/27, Ted Yu <yu...@gmail.com>:
> > This should have been fixed by HBASE-7214 which is in 0.94.4
> >
> > Cheers
> >
> > On Sun, Jan 27, 2013 at 5:51 PM, Robert Dyer <ps...@gmail.com> wrote:
> >
> >> I noticed my HBase 0.94.3 master log was growing extremely large, then
> >> saw
> >> this repeated over and over. Any ideas what is wrong here? Should I
> >> just
> >> remove the /hbase/.archive directory for this table?
> >>
> >> (note: I changed sensitive host/table names, and the 'checking directory'
> >> line seems to exist once per region, omitting most here)
> >>
> >> [log excerpt and stack trace snipped]
> >
>
--
Robert Dyer
rdyer@iastate.edu
Re: Cleaner spamming log file
Posted by Jean-Marc Spaggiari <je...@spaggiari.org>.
Indeed, it's related to both ;) You have the exception, which is fixed
by HBASE-7467, and the size of the logs, which is fixed by HBASE-7214.
Also, I would recommend disabling the DEBUG level for this class
if you can't migrate to 0.94.4.
JM
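[Editor's note: JM's workaround can be applied in HBase's conf/log4j.properties on the master host. Raising this one logger to INFO suppresses the per-region "Checking directory" DEBUG lines without touching the rest of the master's logging; the path and level shown are a sketch of that suggestion, not a line from the thread.]

```properties
# conf/log4j.properties on the HMaster host:
# raise only the cleaner chore's level so its per-region DEBUG output is dropped
log4j.logger.org.apache.hadoop.hbase.master.cleaner.CleanerChore=INFO
```

A restart of the master is needed for the change to take effect.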
2013/1/27, Ted Yu <yu...@gmail.com>:
> This should have been fixed by HBASE-7214 which is in 0.94.4
>
> Cheers
>
> On Sun, Jan 27, 2013 at 5:51 PM, Robert Dyer <ps...@gmail.com> wrote:
>
>> I noticed my HBase 0.94.3 master log was growing extremely large, then
>> saw
>> this repeated over and over. Any ideas what is wrong here? Should I
>> just
>> remove the /hbase/.archive directory for this table?
>>
>> (note: I changed sensitive host/table names, and the 'checking directory'
>> line seems to exist once per region, omitting most here)
>>
>> [log excerpt and stack trace snipped]
>
Re: Cleaner spamming log file
Posted by Ted Yu <yu...@gmail.com>.
This should have been fixed by HBASE-7214, which is in 0.94.4
Cheers
On Sun, Jan 27, 2013 at 5:51 PM, Robert Dyer <ps...@gmail.com> wrote:
> I noticed my HBase 0.94.3 master log was growing extremely large, then saw
> this repeated over and over. Any ideas what is wrong here? Should I just
> remove the /hbase/.archive directory for this table?
>
> (note: I changed sensitive host/table names, and the 'checking directory'
> line seems to exist once per region, omitting most here)
>
> [log excerpt and stack trace snipped]
Re: Cleaner spamming log file
Posted by Jean-Marc Spaggiari <je...@spaggiari.org>.
Hi Robert,
Take a look at that: https://issues.apache.org/jira/browse/HBASE-7467
Your issue is most probably related to that.
JM
2013/1/27, Robert Dyer <ps...@gmail.com>:
> I noticed my HBase 0.94.3 master log was growing extremely large, then saw
> this repeated over and over. Any ideas what is wrong here? Should I just
> remove the /hbase/.archive directory for this table?
>
> (note: I changed sensitive host/table names, and the 'checking directory'
> line seems to exist once per region, omitting most here)
>
> [log excerpt and stack trace snipped]