Posted to dev@hbase.apache.org by "Josh Elser (JIRA)" <ji...@apache.org> on 2019/01/07 23:19:00 UTC

[jira] [Resolved] (HBASE-21234) Archive folder not getting cleaned due to SnapshotHFileCleaner error

     [ https://issues.apache.org/jira/browse/HBASE-21234?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Josh Elser resolved HBASE-21234.
--------------------------------
    Resolution: Invalid

> Archive folder not getting cleaned due to SnapshotHFileCleaner error
> --------------------------------------------------------------------
>
>                 Key: HBASE-21234
>                 URL: https://issues.apache.org/jira/browse/HBASE-21234
>             Project: HBase
>          Issue Type: Bug
>          Components: master
>    Affects Versions: 1.1.2
>            Reporter: Abhishek Gupta
>            Priority: Critical
>              Labels: cleanup, snapshot, snapshots
>
> Getting the following exception during ChoreService runs in the HBase Master logs. As a result we are accumulating a lot of data in the archive folder, since the archive is not getting reclaimed.
> {code:java}
> Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos
>     at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotRegionManifest.internalGetFieldAccessorTable(SnapshotProtos.java:1190)
> {code}
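> For context, the "keeping them just in case" wording in the log reflects the cleaner's fail-safe behaviour: when the snapshot check throws, the chore treats every candidate file as still referenced and deletes nothing, which is why the archive keeps growing. A minimal sketch of that pattern (not the actual HBase source; the interface and class names below are hypothetical placeholders):
> {code:java}
> import java.io.IOException;
> import java.util.Collections;
>
> // Hypothetical sketch of the "keep files on error" fail-safe suggested by the log
> // message above; the real logic lives in SnapshotHFileCleaner / SnapshotFileCache.
> public class FailSafeCleanerSketch {
>
>   /** Stand-in for the snapshot file cache; loading snapshot manifests may throw. */
>   interface SnapshotCheck {
>     Iterable<String> getUnreferencedFiles(Iterable<String> candidates) throws IOException;
>   }
>
>   static Iterable<String> getDeletableFiles(Iterable<String> candidates, SnapshotCheck cache) {
>     try {
>       // Only files referenced by no snapshot are eligible for deletion.
>       return cache.getUnreferencedFiles(candidates);
>     } catch (IOException e) {
>       // On any failure (here the NoClassDefFoundError surfaces as an IOException),
>       // err on the side of safety and report nothing as deletable.
>       return Collections.emptyList();
>     }
>   }
> }
> {code}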
> Complete stack trace:
> {code:java}
> 2018-09-26 10:15:06,188 ERROR [master01,16000,1536315941769_ChoreService_3] snapshot.SnapshotHFileCleaner: Exception while checking if files were valid, keeping them just in case.
> java.io.IOException: ExecutionException
>     at org.apache.hadoop.hbase.snapshot.SnapshotManifestV2.loadRegionManifests(SnapshotManifestV2.java:161)
>     at org.apache.hadoop.hbase.snapshot.SnapshotManifest.load(SnapshotManifest.java:364)
>     at org.apache.hadoop.hbase.snapshot.SnapshotManifest.open(SnapshotManifest.java:130)
>     at org.apache.hadoop.hbase.snapshot.SnapshotReferenceUtil.visitTableStoreFiles(SnapshotReferenceUtil.java:128)
>     at org.apache.hadoop.hbase.snapshot.SnapshotReferenceUtil.getHFileNames(SnapshotReferenceUtil.java:357)
>     at org.apache.hadoop.hbase.snapshot.SnapshotReferenceUtil.getHFileNames(SnapshotReferenceUtil.java:340)
>     at org.apache.hadoop.hbase.master.snapshot.SnapshotHFileCleaner$1.filesUnderSnapshot(SnapshotHFileCleaner.java:88)
>     at org.apache.hadoop.hbase.master.snapshot.SnapshotFileCache.getSnapshotsInProgress(SnapshotFileCache.java:303)
>     at org.apache.hadoop.hbase.master.snapshot.SnapshotFileCache.getUnreferencedFiles(SnapshotFileCache.java:194)
>     at org.apache.hadoop.hbase.master.snapshot.SnapshotHFileCleaner.getDeletableFiles(SnapshotHFileCleaner.java:63)
>     at org.apache.hadoop.hbase.master.cleaner.CleanerChore.checkAndDeleteFiles(CleanerChore.java:287)
>     at org.apache.hadoop.hbase.master.cleaner.CleanerChore.checkAndDeleteEntries(CleanerChore.java:211)
>     at org.apache.hadoop.hbase.master.cleaner.CleanerChore.checkAndDeleteDirectory(CleanerChore.java:234)
>     at org.apache.hadoop.hbase.master.cleaner.CleanerChore.checkAndDeleteEntries(CleanerChore.java:206)
>     at org.apache.hadoop.hbase.master.cleaner.CleanerChore.checkAndDeleteDirectory(CleanerChore.java:234)
>     at org.apache.hadoop.hbase.master.cleaner.CleanerChore.checkAndDeleteEntries(CleanerChore.java:206)
>     at org.apache.hadoop.hbase.master.cleaner.CleanerChore.checkAndDeleteDirectory(CleanerChore.java:234)
>     at org.apache.hadoop.hbase.master.cleaner.CleanerChore.checkAndDeleteEntries(CleanerChore.java:206)
>     at org.apache.hadoop.hbase.master.cleaner.CleanerChore.checkAndDeleteDirectory(CleanerChore.java:234)
>     at org.apache.hadoop.hbase.master.cleaner.CleanerChore.checkAndDeleteEntries(CleanerChore.java:206)
>     at org.apache.hadoop.hbase.master.cleaner.CleanerChore.checkAndDeleteDirectory(CleanerChore.java:234)
>     at org.apache.hadoop.hbase.master.cleaner.CleanerChore.checkAndDeleteEntries(CleanerChore.java:206)
>     at org.apache.hadoop.hbase.master.cleaner.CleanerChore.chore(CleanerChore.java:130)
>     at org.apache.hadoop.hbase.ScheduledChore.run(ScheduledChore.java:185)
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>     at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>     at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos
>     at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotRegionManifest.internalGetFieldAccessorTable(SnapshotProtos.java:1190)
>     at com.google.protobuf.GeneratedMessage.getDescriptorForType(GeneratedMessage.java:98)
>     at com.google.protobuf.AbstractMessage$Builder.findMissingFields(AbstractMessage.java:789)
>     at com.google.protobuf.AbstractMessage$Builder.findMissingFields(AbstractMessage.java:780)
>     at com.google.protobuf.AbstractMessage$Builder.newUninitializedMessageException(AbstractMessage.java:770)
>     at com.google.protobuf.AbstractMessage.newUninitializedMessageException(AbstractMessage.java:237)
>     at com.google.protobuf.AbstractParser.newUninitializedMessageException(AbstractParser.java:57)
>     at com.google.protobuf.AbstractParser.checkMessageInitialized(AbstractParser.java:71)
>     at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
> {code}
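> Note that "Could not initialize class" means the static initializer of SnapshotProtos already failed on an earlier attempt; only that first failure carries the real cause (often a conflicting protobuf jar on the Master's classpath). As a hedged sketch, a probe like the following, run against the same classpath as the Master, can surface that first failure (the class name is taken from the trace above; everything else is illustrative):
> {code:java}
> // Illustrative snippet: force-load the generated protobuf class and print the
> // underlying cause of the initialization failure, if any.
> public class ProbeSnapshotProtos {
>   public static void main(String[] args) {
>     try {
>       // Class.forName(String) initializes the class, re-triggering any static-init failure.
>       Class.forName("org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos");
>       System.out.println("SnapshotProtos initialized fine on this classpath");
>     } catch (Throwable t) {
>       // The first failure is typically an ExceptionInInitializerError whose cause
>       // (e.g. an incompatible com.google.protobuf version) is the real problem.
>       t.printStackTrace();
>     }
>   }
> }
> {code}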



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)