Posted to issues@hbase.apache.org by "Hadoop QA (JIRA)" <ji...@apache.org> on 2016/04/11 07:40:25 UTC

[jira] [Commented] (HBASE-15621) Suppress HBase SnapshotHFile cleaner error messages when a snapshot is going on

    [ https://issues.apache.org/jira/browse/HBASE-15621?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15234511#comment-15234511 ] 

Hadoop QA commented on HBASE-15621:
-----------------------------------

| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
| {color:green}+1{color} | {color:green} hbaseanti {color} | {color:green} 0m 0s {color} | {color:green} Patch does not have any anti-patterns. {color} |
| {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s {color} | {color:green} The patch does not contain any @author tags. {color} |
| {color:green}+1{color} | {color:green} test4tests {color} | {color:green} 0m 0s {color} | {color:green} The patch appears to include 3 new or modified test files. {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 4m 3s {color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 25s {color} | {color:green} master passed with JDK v1.8.0 {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 0m 49s {color} | {color:green} master passed with JDK v1.7.0_79 {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 5m 13s {color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} mvneclipse {color} | {color:green} 0m 23s {color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green} 2m 54s {color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 3s {color} | {color:green} master passed with JDK v1.8.0 {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 0m 57s {color} | {color:green} master passed with JDK v1.7.0_79 {color} |
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 1m 1s {color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 30s {color} | {color:green} the patch passed with JDK v1.8.0 {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green} 1m 30s {color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 0m 50s {color} | {color:green} the patch passed with JDK v1.7.0_79 {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green} 0m 50s {color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 7m 8s {color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} mvneclipse {color} | {color:green} 0m 30s {color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s {color} | {color:green} Patch has no whitespace issues. {color} |
| {color:green}+1{color} | {color:green} hadoopcheck {color} | {color:green} 45m 37s {color} | {color:green} Patch does not cause any errors with Hadoop 2.4.0 2.4.1 2.5.0 2.5.1 2.5.2 2.6.1 2.6.2 2.6.3 2.7.1. {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green} 4m 1s {color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 22s {color} | {color:green} the patch passed with JDK v1.8.0 {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 8s {color} | {color:green} the patch passed with JDK v1.7.0_79 {color} |
| {color:red}-1{color} | {color:red} unit {color} | {color:red} 220m 7s {color} | {color:red} hbase-server in the patch failed. {color} |
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green} 0m 29s {color} | {color:green} Patch does not generate ASF License warnings. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 301m 11s {color} | {color:black} {color} |
\\
\\
|| Reason || Tests ||
| Failed junit tests | hadoop.hbase.master.balancer.TestStochasticLoadBalancer2 |
| Timed out junit tests | org.apache.hadoop.hbase.snapshot.TestMobSecureExportSnapshot |
|   | org.apache.hadoop.hbase.snapshot.TestSecureExportSnapshot |
|   | org.apache.hadoop.hbase.security.access.TestNamespaceCommands |
|   | org.apache.hadoop.hbase.snapshot.TestMobExportSnapshot |
|   | org.apache.hadoop.hbase.snapshot.TestExportSnapshot |
|   | org.apache.hadoop.hbase.snapshot.TestMobFlushSnapshotFromClient |
\\
\\
|| Subsystem || Report/Notes ||
| JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/12797950/HBASE-15621-v001.patch |
| JIRA Issue | HBASE-15621 |
| Optional Tests |  asflicense  javac  javadoc  unit  findbugs  hadoopcheck  hbaseanti  checkstyle  compile  |
| uname | Linux asf900.gq1.ygridcore.net 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | /home/jenkins/jenkins-slave/workspace/PreCommit-HBASE-Build/component/dev-support/hbase-personality.sh |
| git revision | master / 80df1cb |
| Default Java | 1.7.0_79 |
| Multi-JDK versions |  /home/jenkins/tools/java/jdk1.8.0:1.8.0 /usr/local/jenkins/java/jdk1.7.0_79:1.7.0_79 |
| findbugs | v3.0.0 |
| unit | https://builds.apache.org/job/PreCommit-HBASE-Build/1352/artifact/patchprocess/patch-unit-hbase-server.txt |
| unit test logs |  https://builds.apache.org/job/PreCommit-HBASE-Build/1352/artifact/patchprocess/patch-unit-hbase-server.txt |
|  Test Results | https://builds.apache.org/job/PreCommit-HBASE-Build/1352/testReport/ |
| modules | C: hbase-server U: hbase-server |
| Console output | https://builds.apache.org/job/PreCommit-HBASE-Build/1352/console |
| Powered by | Apache Yetus 0.2.0   http://yetus.apache.org |


This message was automatically generated.



> Suppress HBase SnapshotHFile cleaner error messages when a snapshot is going on
> -------------------------------------------------------------------------------
>
>                 Key: HBASE-15621
>                 URL: https://issues.apache.org/jira/browse/HBASE-15621
>             Project: HBase
>          Issue Type: Bug
>          Components: snapshots
>    Affects Versions: 2.0.0
>            Reporter: huaxiang sun
>            Assignee: huaxiang sun
>            Priority: Minor
>         Attachments: HBASE-15621-v001.patch
>
>
> Ran into the following exception while a snapshot was in progress.
> A partially written region-manifest or data.manifest file can be read and parsed by the cleaner, which results in an InvalidProtocolBufferException; that exception should be ignored for an in-progress snapshot (a hypothetical sketch of the suppression follows the stack trace below).
> {code}
> 2016-04-01 00:31:50,200 ERROR org.apache.hadoop.hbase.master.snapshot.SnapshotHFileCleaner: Exception while checking if: *** was valid, keeping it just in case.
> com.google.protobuf.InvalidProtocolBufferException: While parsing a protocol message, the input ended unexpectedly in the middle of a field.  This could mean either than the input has been truncated or that an embedded message misreported its own length.
>         at com.google.protobuf.InvalidProtocolBufferException.truncatedMessage(InvalidProtocolBufferException.java:70)
>         at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:746)
>         at com.google.protobuf.CodedInputStream.readRawByte(CodedInputStream.java:769)
>         at com.google.protobuf.CodedInputStream.readRawVarint64(CodedInputStream.java:462)
>         at com.google.protobuf.CodedInputStream.readUInt64(CodedInputStream.java:188)
>         at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotRegionManifest$StoreFile.<init>(SnapshotProtos.java:1331)
>         at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotRegionManifest$StoreFile.<init>(SnapshotProtos.java:1263)
>         at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotRegionManifest$StoreFile$1.parsePartialFrom(SnapshotProtos.java:1364)
>         at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotRegionManifest$StoreFile$1.parsePartialFrom(SnapshotProtos.java:1359)
>         at com.google.protobuf.CodedInputStream.readMessage(CodedInputStream.java:309)
>         at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotRegionManifest$FamilyFiles.<init>(SnapshotProtos.java:2161)
>         at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotRegionManifest$FamilyFiles.<init>(SnapshotProtos.java:2103)
>         at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotRegionManifest$FamilyFiles$1.parsePartialFrom(SnapshotProtos.java:2197)
>         at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotRegionManifest$FamilyFiles$1.parsePartialFrom(SnapshotProtos.java:2192)
>         at com.google.protobuf.CodedInputStream.readMessage(CodedInputStream.java:309)
>         at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotRegionManifest.<init>(SnapshotProtos.java:1165)
>         at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotRegionManifest.<init>(SnapshotProtos.java:1094)
>         at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotRegionManifest$1.parsePartialFrom(SnapshotProtos.java:1201)
>         at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotRegionManifest$1.parsePartialFrom(SnapshotProtos.java:1196)
>         at com.google.protobuf.CodedInputStream.readMessage(CodedInputStream.java:309)
>         at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotDataManifest.<init>(SnapshotProtos.java:3858)
>         at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotDataManifest.<init>(SnapshotProtos.java:3792)
>         at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotDataManifest$1.parsePartialFrom(SnapshotProtos.java:3894)
>         at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotDataManifest$1.parsePartialFrom(SnapshotProtos.java:3889)
>         at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
>         at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
>         at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
>         at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
>         at org.apache.hadoop.hbase.protobuf.generated.SnapshotProtos$SnapshotDataManifest.parseFrom(SnapshotProtos.java:4094)
>         at org.apache.hadoop.hbase.snapshot.SnapshotManifest.readDataManifest(SnapshotManifest.java:433)
>         at org.apache.hadoop.hbase.snapshot.SnapshotManifest.load(SnapshotManifest.java:273)
>         at org.apache.hadoop.hbase.snapshot.SnapshotManifest.open(SnapshotManifest.java:119)
>         at org.apache.hadoop.hbase.snapshot.SnapshotReferenceUtil.visitTableStoreFiles(SnapshotReferenceUtil.java:125)
>         at org.apache.hadoop.hbase.snapshot.SnapshotReferenceUtil.getHFileNames(SnapshotReferenceUtil.java:346)
>         at org.apache.hadoop.hbase.snapshot.SnapshotReferenceUtil.getHFileNames(SnapshotReferenceUtil.java:329)
>         at org.apache.hadoop.hbase.master.snapshot.SnapshotHFileCleaner$1.filesUnderSnapshot(SnapshotHFileCleaner.java:80)
>         at org.apache.hadoop.hbase.master.snapshot.SnapshotFileCache.refreshCache(SnapshotFileCache.java:259)
>         at org.apache.hadoop.hbase.master.snapshot.SnapshotFileCache.contains(SnapshotFileCache.java:179)
>         at org.apache.hadoop.hbase.master.snapshot.SnapshotHFileCleaner.isFileDeletable(SnapshotHFileCleaner.java:60)
>         at org.apache.hadoop.hbase.master.cleaner.BaseFileCleanerDelegate$1.apply(BaseFileCleanerDelegate.java:38)
>         at org.apache.hadoop.hbase.master.cleaner.BaseFileCleanerDelegate$1.apply(BaseFileCleanerDelegate.java:35)
>         at com.google.common.collect.Iterators$8.computeNext(Iterators.java:688)
>         at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143)
>         at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138)
>         at com.google.common.collect.Iterators$8.computeNext(Iterators.java:686)
>         at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143)
>         at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138)
>         at com.google.common.collect.Iterators$6.hasNext(Iterators.java:582)
> {code}
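>
> As an illustration only (not the attached patch): a minimal sketch of downgrading the error while a snapshot is in flight. The ManifestReader interface and the snapshotInProgress flag are hypothetical names assumed for the example, not real HBase API.
> {code}
> import java.io.IOException;
>
> import org.apache.commons.logging.Log;
> import org.apache.commons.logging.LogFactory;
>
> import com.google.protobuf.InvalidProtocolBufferException;
>
> public class PartialManifestSketch {
>   private static final Log LOG = LogFactory.getLog(PartialManifestSketch.class);
>
>   /** Minimal stand-in for the manifest read; hypothetical, for illustration. */
>   interface ManifestReader<T> {
>     T read() throws IOException;
>   }
>
>   /**
>    * Reads a snapshot manifest but downgrades a truncated-protobuf failure to
>    * DEBUG when a snapshot is known to be in progress, since a partial
>    * region-manifest/data.manifest is expected in that window.
>    */
>   static <T> T readTolerantOfInProgressSnapshot(ManifestReader<T> reader,
>       boolean snapshotInProgress) throws IOException {
>     try {
>       return reader.read();
>     } catch (InvalidProtocolBufferException e) {
>       if (snapshotInProgress) {
>         LOG.debug("Partial snapshot manifest while a snapshot is in progress; ignoring", e);
>         return null; // caller keeps the referenced files, as the cleaner already does on error
>       }
>       throw e; // a truncated manifest with no snapshot running is a real problem
>     }
>   }
> }
> {code}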
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)