Posted to common-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2015/11/18 23:56:53 UTC

Build failed in Jenkins: Hadoop-Common-trunk #2005

See <https://builds.apache.org/job/Hadoop-Common-trunk/2005/changes>

Changes:

[stevel] YARN-4279. Mark ApplicationId and ApplicationAttemptId static methods as

[wangda] move fix version of YARN-4287 from 2.8.0 to 2.7.3

------------------------------------------
[...truncated 8535 lines...]
  [javadoc] Loading source files for package org.apache.hadoop.metrics...
  [javadoc] Loading source files for package org.apache.hadoop.metrics.ganglia...
  [javadoc] Loading source files for package org.apache.hadoop.metrics.jvm...
  [javadoc] Loading source files for package org.apache.hadoop.metrics.spi...
  [javadoc] Loading source files for package org.apache.hadoop.metrics.util...
  [javadoc] Loading source files for package org.apache.hadoop.metrics2...
  [javadoc] Loading source files for package org.apache.hadoop.metrics2.annotation...
  [javadoc] Loading source files for package org.apache.hadoop.metrics2.filter...
  [javadoc] Loading source files for package org.apache.hadoop.metrics2.impl...
  [javadoc] Loading source files for package org.apache.hadoop.metrics2.lib...
  [javadoc] Loading source files for package org.apache.hadoop.metrics2.sink...
  [javadoc] Loading source files for package org.apache.hadoop.metrics2.sink.ganglia...
  [javadoc] Loading source files for package org.apache.hadoop.metrics2.source...
  [javadoc] Loading source files for package org.apache.hadoop.metrics2.util...
  [javadoc] Loading source files for package org.apache.hadoop.net...
  [javadoc] Loading source files for package org.apache.hadoop.net.unix...
  [javadoc] Loading source files for package org.apache.hadoop.security...
  [javadoc] Loading source files for package org.apache.hadoop.security.alias...
  [javadoc] Loading source files for package org.apache.hadoop.security.authorize...
  [javadoc] Loading source files for package org.apache.hadoop.security.http...
  [javadoc] Loading source files for package org.apache.hadoop.security.protocolPB...
  [javadoc] Loading source files for package org.apache.hadoop.security.ssl...
  [javadoc] Loading source files for package org.apache.hadoop.security.token...
  [javadoc] Loading source files for package org.apache.hadoop.security.token.delegation...
  [javadoc] Loading source files for package org.apache.hadoop.security.token.delegation.web...
  [javadoc] Loading source files for package org.apache.hadoop.service...
  [javadoc] Loading source files for package org.apache.hadoop.tools...
  [javadoc] Loading source files for package org.apache.hadoop.tools.protocolPB...
  [javadoc] Loading source files for package org.apache.hadoop.tracing...
  [javadoc] Loading source files for package org.apache.hadoop.util...
  [javadoc] Loading source files for package org.apache.hadoop.util.bloom...
  [javadoc] Loading source files for package org.apache.hadoop.util.curator...
  [javadoc] Loading source files for package org.apache.hadoop.util.hash...
  [javadoc] Constructing Javadoc information...
  [javadoc] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/nativeio/NativeIO.java>:46: warning: Unsafe is internal proprietary API and may be removed in a future release
  [javadoc] import sun.misc.Unsafe;
  [javadoc]                ^
  [javadoc] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/FastByteComparisons.java>:25: warning: Unsafe is internal proprietary API and may be removed in a future release
  [javadoc] import sun.misc.Unsafe;
  [javadoc]                ^
  [javadoc] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SecurityUtil.java>:54: warning: ResolverConfiguration is internal proprietary API and may be removed in a future release
  [javadoc] import sun.net.dns.ResolverConfiguration;
  [javadoc]                   ^
  [javadoc] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SecurityUtil.java>:55: warning: IPAddressUtil is internal proprietary API and may be removed in a future release
  [javadoc] import sun.net.util.IPAddressUtil;
  [javadoc]                    ^
  [javadoc] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/SignalLogger.java>:21: warning: Signal is internal proprietary API and may be removed in a future release
  [javadoc] import sun.misc.Signal;
  [javadoc]                ^
  [javadoc] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/SignalLogger.java>:22: warning: SignalHandler is internal proprietary API and may be removed in a future release
  [javadoc] import sun.misc.SignalHandler;
  [javadoc]                ^
  [javadoc] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/SignalLogger.java>:44: warning: SignalHandler is internal proprietary API and may be removed in a future release
  [javadoc]   private static class Handler implements SignalHandler {
  [javadoc]                                           ^
  [javadoc] ExcludePrivateAnnotationsJDiffDoclet
  [javadoc] JDiff: doclet started ...
  [javadoc] JDiff: reading the old API in from file '<https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/dev-support/jdiff/Apache_Hadoop_Common_2.6.0.xml>'...
  [javadoc] Warning: API identifier in the XML file (hadoop-core 2.6.0) differs from the name of the file 'Apache_Hadoop_Common_2.6.0.xml'
  [javadoc] Warning: incorrectly formatted @link in text: Retrieve the Bloom filter used by this instance of the Reader.
  [javadoc]  @return a Bloom filter (see {@link Filter})
  [javadoc] Warning: incorrectly formatted @link in text: Helper class to shutdown {@link Thread}s and {@link ExecutorService}s.
  [javadoc] Warning: incorrectly formatted @link in text: Constructor
  [javadoc]  @param vectorSize The vector size of <i>this</i> filter.
  [javadoc]  @param nbHash The number of hash function to consider.
  [javadoc]  @param hashType type of the hashing function (see
  [javadoc]  {@link org.apache.hadoop.util.hash.Hash}).
  [javadoc] Warning: incorrectly formatted @link in text: Constructor
  [javadoc]  @param vectorSize The vector size of <i>this</i> filter.
  [javadoc]  @param nbHash The number of hash function to consider.
  [javadoc]  @param hashType type of the hashing function (see
  [javadoc]  {@link org.apache.hadoop.util.hash.Hash}).
  [javadoc] Warning: incorrectly formatted @link in text: Constructor.
  [javadoc]  <p>
  [javadoc]  Builds a hash function that must obey to a given maximum number of returned values and a highest value.
  [javadoc]  @param maxValue The maximum highest returned value.
  [javadoc]  @param nbHash The number of resulting hashed values.
  [javadoc]  @param hashType type of the hashing function (see {@link Hash}).
  [javadoc] Warning: incorrectly formatted @link in text: Constructor
  [javadoc]  @param vectorSize The vector size of <i>this</i> filter.
  [javadoc]  @param nbHash The number of hash function to consider.
  [javadoc]  @param hashType type of the hashing function (see
  [javadoc]  {@link org.apache.hadoop.util.hash.Hash}).
  [javadoc]  finished
  [javadoc] JDiff: reading the new API in from file '<https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target/site/jdiff/xml/Apache_Hadoop_Common_3.0.0-SNAPSHOT.xml>'...
  [javadoc] Warning: incorrectly formatted @link in text: Options to be used by the {@link Find} command and its {@link Expression}s.
  [javadoc] Warning: incorrectly formatted @link in text: Retrieve the Bloom filter used by this instance of the Reader.
  [javadoc]  @return a Bloom filter (see {@link Filter})
  [javadoc] Error: duplicate comment id: org.apache.hadoop.metrics2.MetricsSystem.register_changed(java.lang.String, java.lang.String, T)
     [xslt] Processing <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target/findbugsXml.xml> to <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target/site/findbugs.html>
     [xslt] Loading stylesheet /home/jenkins/tools/findbugs/latest/src/xsl/default.xsl
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (pre-dist) @ hadoop-common ---
[INFO] Executing tasks

main:
     [exec] The required option isal.lib isn't given, bundling ISA-L skipped
[INFO] Executed tasks
[INFO] 
[INFO] >>> maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-common >>>
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-os) @ hadoop-common ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-common ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc) @ hadoop-common ---
[INFO] No changes detected in protoc files, skipping generation.
[INFO] 
[INFO] <<< maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-common <<<
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-common ---
[INFO] 
ExcludePrivateAnnotationsStandardDoclet
13 warnings
[WARNING] Javadoc Warnings
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SecurityUtil.java>:54: warning: ResolverConfiguration is internal proprietary API and may be removed in a future release
[WARNING] import sun.net.dns.ResolverConfiguration;
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SecurityUtil.java>:55: warning: IPAddressUtil is internal proprietary API and may be removed in a future release
[WARNING] import sun.net.util.IPAddressUtil;
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/SignalLogger.java>:21: warning: Signal is internal proprietary API and may be removed in a future release
[WARNING] import sun.misc.Signal;
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/SignalLogger.java>:22: warning: SignalHandler is internal proprietary API and may be removed in a future release
[WARNING] import sun.misc.SignalHandler;
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/SignalLogger.java>:44: warning: SignalHandler is internal proprietary API and may be removed in a future release
[WARNING] private static class Handler implements SignalHandler {
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/nativeio/NativeIO.java>:46: warning: Unsafe is internal proprietary API and may be removed in a future release
[WARNING] import sun.misc.Unsafe;
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/FastByteComparisons.java>:25: warning: Unsafe is internal proprietary API and may be removed in a future release
[WARNING] import sun.misc.Unsafe;
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/DelegationTokenRenewer.java>:122: warning - @return tag has no arguments.
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileContext.java>:2163: warning - @return tag has no arguments.
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileContext.java>:2705: warning - @param argument "src" is not a parameter name.
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/ftp/FTPFileSystem.java>:184: warning - @return tag has no arguments.
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/ftp/FTPFileSystem.java>:578: warning - @return tag has no arguments.
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/file/tfile/TFile.java>:985: warning - @return tag has no arguments.
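
The javadoc warnings above are emitted whenever a compilation unit imports JDK-internal classes such as sun.misc.Signal or sun.misc.SignalHandler. As a hedged illustration only (hypothetical class name, not the actual SignalLogger source), a minimal handler that would reproduce the same "internal proprietary API" warnings on JDK 7/8 looks like this:

    import sun.misc.Signal;
    import sun.misc.SignalHandler;

    public class LoggingSignalHandler implements SignalHandler {
      @Override
      public void handle(Signal signal) {
        // Log the received signal, e.g. "RECEIVED SIGNAL 15: SIGTERM".
        System.err.println("RECEIVED SIGNAL " + signal.getNumber()
            + ": SIG" + signal.getName());
      }

      public static void main(String[] args) throws InterruptedException {
        // Install the handler for SIGTERM; Signal.handle returns the previous
        // handler, which could be chained if needed.
        Signal.handle(new Signal("TERM"), new LoggingSignalHandler());
        Thread.sleep(60_000);  // keep the JVM alive long enough to receive a signal
      }
    }

Because these types live under sun.misc rather than a supported java.* package, both javac and javadoc flag every import and use site, which is exactly what the warnings listed above show.
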
[INFO] 
[INFO] --- maven-assembly-plugin:2.4:single (dist) @ hadoop-common ---
[WARNING] The following patterns were never triggered in this artifact exclusion filter:
o  'org.apache.ant:*:jar'
o  'jdiff:jdiff:jar'

[INFO] Copying files to <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target/hadoop-common-3.0.0-SNAPSHOT>
[INFO] 
[INFO] --- maven-jar-plugin:2.5:jar (default-jar) @ hadoop-common ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-common ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-common ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-common ---
[INFO] 
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ hadoop-common ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-common ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-common ---
[INFO] 
ExcludePrivateAnnotationsStandardDoclet
13 warnings
[WARNING] Javadoc Warnings
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SecurityUtil.java>:54: warning: ResolverConfiguration is internal proprietary API and may be removed in a future release
[WARNING] import sun.net.dns.ResolverConfiguration;
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SecurityUtil.java>:55: warning: IPAddressUtil is internal proprietary API and may be removed in a future release
[WARNING] import sun.net.util.IPAddressUtil;
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/SignalLogger.java>:21: warning: Signal is internal proprietary API and may be removed in a future release
[WARNING] import sun.misc.Signal;
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/SignalLogger.java>:22: warning: SignalHandler is internal proprietary API and may be removed in a future release
[WARNING] import sun.misc.SignalHandler;
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/SignalLogger.java>:44: warning: SignalHandler is internal proprietary API and may be removed in a future release
[WARNING] private static class Handler implements SignalHandler {
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/nativeio/NativeIO.java>:46: warning: Unsafe is internal proprietary API and may be removed in a future release
[WARNING] import sun.misc.Unsafe;
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/FastByteComparisons.java>:25: warning: Unsafe is internal proprietary API and may be removed in a future release
[WARNING] import sun.misc.Unsafe;
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/DelegationTokenRenewer.java>:122: warning - @return tag has no arguments.
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileContext.java>:2163: warning - @return tag has no arguments.
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileContext.java>:2705: warning - @param argument "src" is not a parameter name.
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/ftp/FTPFileSystem.java>:184: warning - @return tag has no arguments.
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/ftp/FTPFileSystem.java>:578: warning - @return tag has no arguments.
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/file/tfile/TFile.java>:985: warning - @return tag has no arguments.
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target/hadoop-common-3.0.0-SNAPSHOT-javadoc.jar>
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop NFS 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-nfs ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target>
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-nfs ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-nfs ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Annotations ......................... SUCCESS [  8.920 s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [ 22.907 s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [05:35 min]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [  4.075 s]
[INFO] Apache Hadoop Common .............................. SUCCESS [29:26 min]
[INFO] Apache Hadoop NFS ................................. FAILURE [  0.073 s]
[INFO] Apache Hadoop KMS ................................. SKIPPED
[INFO] Apache Hadoop Common Project ...................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 35:39 min
[INFO] Finished at: 2015-11-18T22:56:32+00:00
[INFO] Final Memory: 92M/900M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:2.15:checkstyle (default-cli) on project hadoop-nfs: An error has occurred in Checkstyle report generation. Failed during checkstyle execution: Unable to find configuration file at location: checkstyle/checkstyle.xml: Could not find resource 'checkstyle/checkstyle.xml'. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-nfs
result: 1

################
# mvn -eaf test -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
################
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results

Jenkins build is back to normal : Hadoop-Common-trunk #2007

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Common-trunk/2007/changes>


Build failed in Jenkins: Hadoop-Common-trunk #2006

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Common-trunk/2006/changes>

Changes:

[ozawa] HADOOP-12582. Using BytesWritable's getLength() and getBytes() instead

[ozawa] HADOOP-12575. Add build instruction for docker toolbox instead of

------------------------------------------
[...truncated 5610 lines...]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
"Thread-179"  prio=5 tid=404 in Object.wait()
java.lang.Thread.State: WAITING (on object monitor)
        at java.lang.Object.wait(Native Method)
        at java.lang.Thread.join(Thread.java:1281)
        at java.lang.Thread.join(Thread.java:1355)
        at org.apache.hadoop.test.MultithreadedTestUtil$TestContext.stop(MultithreadedTestUtil.java:164)
        at org.apache.hadoop.ha.MiniZKFCCluster.stop(MiniZKFCCluster.java:142)
        at org.apache.hadoop.ha.TestZKFailoverController.testGracefulFailoverMultipleZKfcs(TestZKFailoverController.java:636)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
        at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
        at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
        at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
Tests run: 20, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 78.161 sec <<< FAILURE! - in org.apache.hadoop.ha.TestZKFailoverController
testGracefulFailoverMultipleZKfcs(org.apache.hadoop.ha.TestZKFailoverController)  Time elapsed: 25.123 sec  <<< ERROR!
java.lang.Exception: test timed out after 25000 milliseconds
	at java.lang.Object.wait(Native Method)
	at org.apache.hadoop.ha.ZKFailoverController.waitForActiveAttempt(ZKFailoverController.java:472)
	at org.apache.hadoop.ha.ZKFailoverController.doGracefulFailover(ZKFailoverController.java:679)
	at org.apache.hadoop.ha.ZKFailoverController.access$400(ZKFailoverController.java:62)
	at org.apache.hadoop.ha.ZKFailoverController$3.run(ZKFailoverController.java:607)
	at org.apache.hadoop.ha.ZKFailoverController$3.run(ZKFailoverController.java:604)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1669)
	at org.apache.hadoop.ha.ZKFailoverController.gracefulFailoverToYou(ZKFailoverController.java:604)
	at org.apache.hadoop.ha.ZKFCRpcServer.gracefulFailover(ZKFCRpcServer.java:94)
	at org.apache.hadoop.ha.TestZKFailoverController.testGracefulFailoverMultipleZKfcs(TestZKFailoverController.java:620)

        at org.apache.zookeeper.JUnit4ZKTestRunner$LoggedInvokeMethod.evaluate(JUnit4ZKTestRunner.java:
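
The timeout reported for TestZKFailoverController above is JUnit 4's per-test timeout: when a test annotated with a timeout runs longer than the limit, JUnit interrupts it and reports an error with exactly the "test timed out after N milliseconds" message. A minimal sketch (a hypothetical test, not the Hadoop code itself), assuming JUnit 4 on the classpath:

    import org.junit.Test;

    public class TimeoutExample {
      @Test(timeout = 25000)  // fail if the test runs longer than 25000 ms
      public void blocksForever() throws Exception {
        Object lock = new Object();
        synchronized (lock) {
          // Never notified, so the test thread stays in Object.wait() until
          // JUnit's timeout fires and reports
          // "test timed out after 25000 milliseconds".
          lock.wait();
        }
      }
    }
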
Running org.apache.hadoop.ha.TestZKFailoverControllerStress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 97.78 sec - in org.apache.hadoop.ha.TestZKFailoverControllerStress
Running org.apache.hadoop.ha.TestActiveStandbyElector
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.966 sec - in org.apache.hadoop.ha.TestActiveStandbyElector
Running org.apache.hadoop.ha.TestSshFenceByTcpPort
Tests run: 4, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 4.37 sec - in org.apache.hadoop.ha.TestSshFenceByTcpPort
Running org.apache.hadoop.ha.TestHAAdmin
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.031 sec - in org.apache.hadoop.ha.TestHAAdmin
Running org.apache.hadoop.ha.TestFailoverController
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.213 sec - in org.apache.hadoop.ha.TestFailoverController
Running org.apache.hadoop.ha.TestShellCommandFencer
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.48 sec - in org.apache.hadoop.ha.TestShellCommandFencer
Running org.apache.hadoop.ha.TestNodeFencer
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.011 sec - in org.apache.hadoop.ha.TestNodeFencer
Running org.apache.hadoop.jmx.TestJMXJsonServlet
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.181 sec - in org.apache.hadoop.jmx.TestJMXJsonServlet
Running org.apache.hadoop.conf.TestConfiguration
Tests run: 62, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.408 sec - in org.apache.hadoop.conf.TestConfiguration
Running org.apache.hadoop.conf.TestConfigurationDeprecation
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.808 sec - in org.apache.hadoop.conf.TestConfigurationDeprecation
Running org.apache.hadoop.conf.TestConfServlet
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.538 sec - in org.apache.hadoop.conf.TestConfServlet
Running org.apache.hadoop.conf.TestConfigurationSubclass
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.851 sec - in org.apache.hadoop.conf.TestConfigurationSubclass
Running org.apache.hadoop.conf.TestGetInstances
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.643 sec - in org.apache.hadoop.conf.TestGetInstances
Running org.apache.hadoop.conf.TestReconfiguration
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.812 sec - in org.apache.hadoop.conf.TestReconfiguration
Running org.apache.hadoop.conf.TestDeprecatedKeys
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.26 sec - in org.apache.hadoop.conf.TestDeprecatedKeys
Running org.apache.hadoop.log.TestLogLevel
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.451 sec - in org.apache.hadoop.log.TestLogLevel
Running org.apache.hadoop.log.TestLog4Json
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.777 sec - in org.apache.hadoop.log.TestLog4Json
Running org.apache.hadoop.cli.TestCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.239 sec - in org.apache.hadoop.cli.TestCLI
Running org.apache.hadoop.tracing.TestTraceUtils
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.232 sec - in org.apache.hadoop.tracing.TestTraceUtils
Running org.apache.hadoop.util.TestDirectBufferPool
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.293 sec - in org.apache.hadoop.util.TestDirectBufferPool
Running org.apache.hadoop.util.TestSysInfoLinux
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.928 sec - in org.apache.hadoop.util.TestSysInfoLinux
Running org.apache.hadoop.util.TestLightWeightGSet
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.381 sec - in org.apache.hadoop.util.TestLightWeightGSet
Running org.apache.hadoop.util.TestHostsFileReader
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.326 sec - in org.apache.hadoop.util.TestHostsFileReader
Running org.apache.hadoop.util.hash.TestHash
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.891 sec - in org.apache.hadoop.util.hash.TestHash
Running org.apache.hadoop.util.TestPureJavaCrc32
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.998 sec - in org.apache.hadoop.util.TestPureJavaCrc32
Running org.apache.hadoop.util.TestNativeLibraryChecker
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.017 sec - in org.apache.hadoop.util.TestNativeLibraryChecker
Running org.apache.hadoop.util.TestProtoUtil
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.515 sec - in org.apache.hadoop.util.TestProtoUtil
Running org.apache.hadoop.util.TestStringUtils
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.707 sec - in org.apache.hadoop.util.TestStringUtils
Running org.apache.hadoop.util.TestSignalLogger
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.316 sec - in org.apache.hadoop.util.TestSignalLogger
Running org.apache.hadoop.util.TestStringInterner
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.193 sec - in org.apache.hadoop.util.TestStringInterner
Running org.apache.hadoop.util.TestRunJar
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.973 sec - in org.apache.hadoop.util.TestRunJar
Running org.apache.hadoop.util.TestProgress
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.27 sec - in org.apache.hadoop.util.TestProgress
Running org.apache.hadoop.util.TestReflectionUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.314 sec - in org.apache.hadoop.util.TestReflectionUtils
Running org.apache.hadoop.util.TestVersionUtil
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.426 sec - in org.apache.hadoop.util.TestVersionUtil
Running org.apache.hadoop.util.TestWinUtils
Tests run: 11, Failures: 0, Errors: 0, Skipped: 11, Time elapsed: 0.46 sec - in org.apache.hadoop.util.TestWinUtils
Running org.apache.hadoop.util.TestIndexedSort
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.856 sec - in org.apache.hadoop.util.TestIndexedSort
Running org.apache.hadoop.util.TestGSet
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.861 sec - in org.apache.hadoop.util.TestGSet
Running org.apache.hadoop.util.TestHttpExceptionUtils
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.281 sec - in org.apache.hadoop.util.TestHttpExceptionUtils
Running org.apache.hadoop.util.TestAsyncDiskService
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.516 sec - in org.apache.hadoop.util.TestAsyncDiskService
Running org.apache.hadoop.util.TestNodeHealthScriptRunner
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.028 sec - in org.apache.hadoop.util.TestNodeHealthScriptRunner
Running org.apache.hadoop.util.bloom.TestBloomFilters
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.966 sec - in org.apache.hadoop.util.bloom.TestBloomFilters
Running org.apache.hadoop.util.TestShutdownHookManager
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.28 sec - in org.apache.hadoop.util.TestShutdownHookManager
Running org.apache.hadoop.util.TestStopWatch
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.179 sec - in org.apache.hadoop.util.TestStopWatch
Running org.apache.hadoop.util.curator.TestChildReaper
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 57.512 sec - in org.apache.hadoop.util.curator.TestChildReaper
Running org.apache.hadoop.util.TestIdentityHashStore
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.337 sec - in org.apache.hadoop.util.TestIdentityHashStore
Running org.apache.hadoop.util.TestLightWeightCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.749 sec - in org.apache.hadoop.util.TestLightWeightCache
Running org.apache.hadoop.util.TestSysInfoWindows
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.348 sec - in org.apache.hadoop.util.TestSysInfoWindows
Running org.apache.hadoop.util.TestDataChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.571 sec - in org.apache.hadoop.util.TestDataChecksum
Running org.apache.hadoop.util.TestLightWeightResizableGSet
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.39 sec - in org.apache.hadoop.util.TestLightWeightResizableGSet
Running org.apache.hadoop.util.TestLineReader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.46 sec - in org.apache.hadoop.util.TestLineReader
Running org.apache.hadoop.util.TestFindClass
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.457 sec - in org.apache.hadoop.util.TestFindClass
Running org.apache.hadoop.util.TestClasspath
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.455 sec - in org.apache.hadoop.util.TestClasspath
Running org.apache.hadoop.util.TestCacheableIPList
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.531 sec - in org.apache.hadoop.util.TestCacheableIPList
Running org.apache.hadoop.util.TestZKUtil
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.233 sec - in org.apache.hadoop.util.TestZKUtil
Running org.apache.hadoop.util.TestOptions
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.134 sec - in org.apache.hadoop.util.TestOptions
Running org.apache.hadoop.util.TestShell
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.945 sec - in org.apache.hadoop.util.TestShell
Running org.apache.hadoop.util.TestGenericsUtil
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.796 sec - in org.apache.hadoop.util.TestGenericsUtil
Running org.apache.hadoop.util.TestDiskChecker
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.748 sec - in org.apache.hadoop.util.TestDiskChecker
Running org.apache.hadoop.util.TestClassUtil
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.151 sec - in org.apache.hadoop.util.TestClassUtil
Running org.apache.hadoop.util.TestNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.279 sec - in org.apache.hadoop.util.TestNativeCodeLoader
Running org.apache.hadoop.util.TestJarFinder
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.63 sec - in org.apache.hadoop.util.TestJarFinder
Running org.apache.hadoop.util.TestNativeCrc32
Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.741 sec - in org.apache.hadoop.util.TestNativeCrc32
Running org.apache.hadoop.util.TestChunkedArrayList
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.35 sec - in org.apache.hadoop.util.TestChunkedArrayList
Running org.apache.hadoop.util.TestMachineList
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.98 sec - in org.apache.hadoop.util.TestMachineList
Running org.apache.hadoop.util.TestShutdownThreadsHelper
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.27 sec - in org.apache.hadoop.util.TestShutdownThreadsHelper
Running org.apache.hadoop.util.TestConfTest
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.603 sec - in org.apache.hadoop.util.TestConfTest
Running org.apache.hadoop.util.TestApplicationClassLoader
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.469 sec - in org.apache.hadoop.util.TestApplicationClassLoader
Running org.apache.hadoop.util.TestFileBasedIPList
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.259 sec - in org.apache.hadoop.util.TestFileBasedIPList
Running org.apache.hadoop.util.TestGenericOptionsParser
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.977 sec - in org.apache.hadoop.util.TestGenericOptionsParser
Running org.apache.hadoop.http.TestGlobalFilter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.532 sec - in org.apache.hadoop.http.TestGlobalFilter
Running org.apache.hadoop.http.TestPathFilter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.591 sec - in org.apache.hadoop.http.TestPathFilter
Running org.apache.hadoop.http.TestHttpCookieFlag
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.139 sec - in org.apache.hadoop.http.TestHttpCookieFlag
Running org.apache.hadoop.http.TestServletFilter
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.295 sec - in org.apache.hadoop.http.TestServletFilter
Running org.apache.hadoop.http.TestHttpRequestLogAppender
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.134 sec - in org.apache.hadoop.http.TestHttpRequestLogAppender
Running org.apache.hadoop.http.TestHttpServerLifecycle
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.081 sec - in org.apache.hadoop.http.TestHttpServerLifecycle
Running org.apache.hadoop.http.TestAuthenticationSessionCookie
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.967 sec - in org.apache.hadoop.http.TestAuthenticationSessionCookie
Running org.apache.hadoop.http.TestSSLHttpServer
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.544 sec - in org.apache.hadoop.http.TestSSLHttpServer
Running org.apache.hadoop.http.lib.TestStaticUserWebFilter
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.176 sec - in org.apache.hadoop.http.lib.TestStaticUserWebFilter
Running org.apache.hadoop.http.TestHttpRequestLog
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.336 sec - in org.apache.hadoop.http.TestHttpRequestLog
Running org.apache.hadoop.http.TestHtmlQuoting
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.687 sec - in org.apache.hadoop.http.TestHtmlQuoting
Running org.apache.hadoop.http.TestHttpServerWebapps
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.367 sec - in org.apache.hadoop.http.TestHttpServerWebapps
Running org.apache.hadoop.http.TestHttpServer
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.013 sec - in org.apache.hadoop.http.TestHttpServer

Results :

Failed tests: 
  TestDecayRpcScheduler.testPriority:199 expected:<0> but was:<1>

Tests in error: 
  TestZKFailoverController.testGracefulFailoverMultipleZKfcs:620->Object.wait:-2 »

Tests run: 3203, Failures: 1, Errors: 1, Skipped: 86
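
The "expected:<0> but was:<1>" text in the TestDecayRpcScheduler failure above is JUnit's standard assertEquals message. A minimal sketch (a hypothetical assertion, not the actual test) of how that message is produced, assuming JUnit 4:

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class PriorityExample {
      @Test
      public void testPriority() {
        int priority = 1;  // value actually produced by the code under test
        // Fails with: java.lang.AssertionError: expected:<0> but was:<1>
        assertEquals(0, priority);
      }
    }
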

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Annotations ......................... SUCCESS [ 24.923 s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [02:37 min]
[INFO] Apache Hadoop Auth ................................ SUCCESS [18:40 min]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [  7.607 s]
[INFO] Apache Hadoop Common .............................. FAILURE [37:26 min]
[INFO] Apache Hadoop NFS ................................. SKIPPED
[INFO] Apache Hadoop KMS ................................. SKIPPED
[INFO] Apache Hadoop Common Project ...................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 59:20 min
[INFO] Finished at: 2015-11-19T10:22:37+00:00
[INFO] Final Memory: 77M/916M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-common: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common
result: 1

################
# mvn -eaf test -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
################
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results