Posted to hdfs-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2013/07/18 15:18:22 UTC

Build failed in Jenkins: Hadoop-Hdfs-trunk #1464

See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1464/changes>

Changes:

[szetszwo] HADOOP-9716. Rpc retries should use the same call ID as the original call.

[cnauroth] HDFS-5003. TestNNThroughputBenchmark failed caused by existing directories. Contributed by Xi Fang.

[jing9] HDFS-5005. Move SnapshotException and SnapshotAccessControlException to o.a.h.hdfs.protocol. Contributed by Jing Zhao.

[hitesh] YARN-865. RM webservices can't query based on application Types. Contributed by Xuan Gong.

[bikas] YARN-922. Change FileSystemRMStateStore to use directories (Jian He via bikas)

[cmccabe] fix misspelling in CHANGES.txt

------------------------------------------
[...truncated 15022 lines...]
	at org.apache.hadoop.hdfs.server.namenode.ha.TestStandbyCheckpoints.testStandbyExceptionThrownDuringCheckpoint(TestStandbyCheckpoints.java:279)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:45)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:42)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:62)

Running org.apache.hadoop.contrib.bkjournal.TestCurrentInprogress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.691 sec
Running org.apache.hadoop.contrib.bkjournal.TestBookKeeperConfiguration
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.096 sec
Running org.apache.hadoop.contrib.bkjournal.TestBookKeeperJournalManager
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.411 sec

Results :

Failed tests:   testStandbyExceptionThrownDuringCheckpoint(org.apache.hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints): SBN should have still been checkpointing.

Tests run: 32, Failures: 1, Errors: 0, Skipped: 0
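
The failure reported above is a JUnit assertion raised at TestStandbyCheckpoints.java:279 with the message quoted in the summary (later copies of this log in the thread show the org.junit.Assert.assertTrue frame). A minimal, self-contained sketch of that pattern follows; the class name and the checked condition are illustrative placeholders, not the actual Hadoop test code:

    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    // Illustrative sketch only: mirrors the assertTrue(message, condition) call
    // reported as failing at TestStandbyCheckpoints.java:279.
    public class StandbyCheckpointAssertionSketch {

      // Placeholder for whatever standby-NameNode state the real test inspects.
      private boolean standbyStillCheckpointing() {
        return false; // forced false so running the sketch reproduces the same failure message
      }

      @Test(timeout = 300000)
      public void testStandbyExceptionThrownDuringCheckpoint() {
        assertTrue("SBN should have still been checkpointing.", standbyStillCheckpointing());
      }
    }

When an assertion like this fails inside the FailOnTimeout statement visible in the trace, Surefire reports it exactly as above: the test method, its class, and the assertion message.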

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS-NFS 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-nfs ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/test-dir>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/test/data>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ hadoop-hdfs-nfs ---
[INFO] Using default encoding to copy filtered resources.
[INFO] 
[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ hadoop-hdfs-nfs ---
[INFO] Compiling 12 source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/classes>
[INFO] 
[INFO] --- maven-resources-plugin:2.2:testResources (default-testResources) @ hadoop-hdfs-nfs ---
[INFO] Using default encoding to copy filtered resources.
[INFO] 
[INFO] --- maven-compiler-plugin:2.5.1:testCompile (default-testCompile) @ hadoop-hdfs-nfs ---
[INFO] Compiling 7 source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/test-classes>
[INFO] 
[INFO] --- maven-surefire-plugin:2.12.3:test (default-test) @ hadoop-hdfs-nfs ---
[INFO] Surefire report directory: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/surefire-reports>

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

Running org.apache.hadoop.hdfs.nfs.nfs3.TestOffsetRange
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec
Running org.apache.hadoop.hdfs.nfs.nfs3.TestRpcProgramNfs3
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec
Running org.apache.hadoop.hdfs.nfs.nfs3.TestDFSClientCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.322 sec
Running org.apache.hadoop.hdfs.nfs.TestMountd
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.299 sec

Results :

Tests run: 8, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (prepare-jar) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT.jar>
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:test-jar (prepare-test-jar) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-tests.jar>
[INFO] 
[INFO] >>> maven-source-plugin:2.1.2:jar (default) @ hadoop-hdfs-nfs >>>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] <<< maven-source-plugin:2.1.2:jar (default) @ hadoop-hdfs-nfs <<<
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar (default) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-sources.jar>
[INFO] 
[INFO] >>> maven-source-plugin:2.1.2:test-jar (default) @ hadoop-hdfs-nfs >>>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] <<< maven-source-plugin:2.1.2:test-jar (default) @ hadoop-hdfs-nfs <<<
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar (default) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-test-sources.jar>
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default) @ hadoop-hdfs-nfs ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is true
[INFO] ****** FindBugsMojo executeFindbugs *******
[INFO] Temp File is <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/findbugsTemp.xml>
[INFO] Fork Value is true
[INFO] xmlOutput is false
[INFO] 
[INFO] --- maven-dependency-plugin:2.1:copy (site) @ hadoop-hdfs-nfs ---
[INFO] Configured Artifact: jdiff:jdiff:1.0.9:jar
[INFO] Configured Artifact: org.apache.hadoop:hadoop-annotations:3.0.0-SNAPSHOT:jar
[INFO] Copying jdiff-1.0.9.jar to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/jdiff.jar>
[INFO] Copying hadoop-annotations-3.0.0-SNAPSHOT.jar to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-annotations.jar>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (site) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (pre-dist) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] >>> maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs-nfs >>>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] <<< maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs-nfs <<<
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs-nfs ---
[INFO] 
ExcludePrivateAnnotationsStandardDoclet
[INFO] 
[INFO] --- maven-assembly-plugin:2.3:single (dist) @ hadoop-hdfs-nfs ---
[WARNING] The following patterns were never triggered in this artifact exclusion filter:
o  'org.apache.ant:*:jar'
o  'jdiff:jdiff:jar'

[INFO] Copying files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT>
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-nfs ---
[WARNING] Artifact org.apache.hadoop:hadoop-hdfs-nfs:java-source:sources:3.0.0-SNAPSHOT already attached to project, ignoring duplicate
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-nfs ---
[WARNING] Artifact org.apache.hadoop:hadoop-hdfs-nfs:java-source:test-sources:3.0.0-SNAPSHOT already attached to project, ignoring duplicate
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (tar) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-nfs ---
[INFO] 
ExcludePrivateAnnotationsStandardDoclet
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-javadoc.jar>
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-nfs ---
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:39:43.710s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [2:24.135s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. FAILURE [54.486s]
[INFO] Apache Hadoop HDFS-NFS ............................ FAILURE [25.396s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.033s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:43:28.624s
[INFO] Finished at: Thu Jul 18 13:17:37 UTC 2013
[INFO] Final Memory: 56M/922M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12.3:test (default-test) on project hadoop-hdfs-bkjournal: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib/bkjournal/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:2.6:checkstyle (default-cli) on project hadoop-hdfs-nfs: An error has occurred in Checkstyle report generation. Failed during checkstyle execution: Unable to find configuration file at location file://<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/dev-support/checkstyle.xml>: Could not find resource 'file://<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/dev-support/checkstyle.xml>'. -> [Help 2]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs-bkjournal
Build step 'Execute shell' marked build as failure
Archiving artifacts
Updating HDFS-5005
Updating HADOOP-9716
Updating HDFS-5003
Updating YARN-922
Updating YARN-865

Hadoop-Hdfs-trunk - Build # 1465 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1465/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 15239 lines...]
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:38:14.180s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [2:35.984s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. FAILURE [59.782s]
[INFO] Apache Hadoop HDFS-NFS ............................ FAILURE [25.917s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.033s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:42:16.756s
[INFO] Finished at: Fri Jul 19 13:16:30 UTC 2013
[INFO] Final Memory: 47M/804M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12.3:test (default-test) on project hadoop-hdfs-bkjournal: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib/bkjournal/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:2.6:checkstyle (default-cli) on project hadoop-hdfs-nfs: An error has occurred in Checkstyle report generation. Failed during checkstyle execution: Unable to find configuration file at location file:///home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/dev-support/checkstyle.xml: Could not find resource 'file:///home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/dev-support/checkstyle.xml'. -> [Help 2]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs-bkjournal
Build step 'Execute shell' marked build as failure
Archiving artifacts
Error updating JIRA issues. Saving issues for next build.
com.atlassian.jira.rpc.exception.RemoteAuthenticationException: Attempt to log in user 'hudson' failed. The maximum number of failed login attempts has been reached. Please log into the application through the web interface to reset the number of failed login attempts.
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Build failed in Jenkins: Hadoop-Hdfs-trunk #1468

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1468/changes>

Changes:

[szetszwo] HADOOP-9754. Remove unnecessary "throws IOException/InterruptedException", and fix generic and other javac warnings.

[jing9] HDFS-5018. Misspelled DFSConfigKeys#DFS_NAMENODE_STALE_DATANODE_INTERVAL_DEFAULT in javadoc of DatanodeInfo#isStale(). Contributed by Ted Yu.

[umamahesh] HDFS-4602. TestBookKeeperHACheckpoints fails. Contributed by Uma Maheswara Rao G.

------------------------------------------
[...truncated 11104 lines...]
Running org.apache.hadoop.hdfs.web.TestWebHdfsUrl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.36 sec
Running org.apache.hadoop.hdfs.web.TestJsonUtil
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.171 sec
Running org.apache.hadoop.hdfs.web.TestWebHdfsTimeouts
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.99 sec
Running org.apache.hadoop.hdfs.web.TestAuthFilter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.406 sec
Running org.apache.hadoop.hdfs.TestConnCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.088 sec
Running org.apache.hadoop.hdfs.TestDFSClientRetries
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 147.485 sec
Running org.apache.hadoop.hdfs.TestListPathServlet
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.37 sec
Running org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec
Running org.apache.hadoop.hdfs.TestDFSStorageStateRecovery
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 119.977 sec
Running org.apache.hadoop.hdfs.TestFileCreationEmpty
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.466 sec
Running org.apache.hadoop.hdfs.TestSetrepIncreasing
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.788 sec
Running org.apache.hadoop.hdfs.TestEncryptedTransfer
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 83.075 sec
Running org.apache.hadoop.hdfs.TestDFSUpgrade
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.513 sec
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.974 sec
Running org.apache.hadoop.hdfs.TestHFlush
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.57 sec
Running org.apache.hadoop.hdfs.TestFileAppendRestart
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.063 sec
Running org.apache.hadoop.hdfs.TestDatanodeReport
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.489 sec
Running org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Tests run: 10, Failures: 0, Errors: 0, Skipped: 10, Time elapsed: 0.193 sec
Running org.apache.hadoop.hdfs.TestFileInputStreamCache
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.205 sec
Running org.apache.hadoop.hdfs.TestRestartDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.113 sec
Running org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.001 sec
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.11 sec
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.929 sec
Running org.apache.hadoop.hdfs.TestClientReportBadBlock
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 45.991 sec
Running org.apache.hadoop.hdfs.TestQuota
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.419 sec
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.689 sec
Running org.apache.hadoop.hdfs.TestDatanodeRegistration
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.703 sec
Running org.apache.hadoop.hdfs.TestAbandonBlock
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.006 sec
Running org.apache.hadoop.hdfs.TestDFSShell
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.945 sec
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.334 sec
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec
Running org.apache.hadoop.hdfs.TestPeerCache
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.318 sec
Running org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 8.636 sec
Running org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.153 sec
Running org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 46.505 sec
Running org.apache.hadoop.hdfs.TestMultiThreadedHflush
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.741 sec
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.684 sec
Running org.apache.hadoop.hdfs.TestMiniDFSCluster
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.081 sec
Running org.apache.hadoop.hdfs.TestLease
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.924 sec
Running org.apache.hadoop.hdfs.TestListFilesInFileContext
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.251 sec
Running org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.306 sec
Running org.apache.hadoop.hdfs.TestDFSClientFailover
Tests run: 6, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 9.86 sec
Running org.apache.hadoop.hdfs.TestFileAppend2
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.806 sec
Running org.apache.hadoop.hdfs.TestLocalDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.011 sec
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.744 sec
Running org.apache.hadoop.hdfs.TestSeekBug
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.4 sec
Running org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.859 sec
Running org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.198 sec
Running org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.17 sec
Running org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.06 sec
Running org.apache.hadoop.hdfs.util.TestDiff
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.208 sec
Running org.apache.hadoop.hdfs.util.TestMD5FileUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.22 sec
Running org.apache.hadoop.hdfs.util.TestDirectBufferPool
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.097 sec
Running org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.16 sec
Running org.apache.hadoop.hdfs.util.TestGSet
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.671 sec
Running org.apache.hadoop.hdfs.util.TestXMLUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec
Running org.apache.hadoop.hdfs.util.TestCyclicIteration
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.061 sec
Running org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.16 sec
Running org.apache.hadoop.hdfs.TestSetTimes
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.34 sec
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec
Running org.apache.hadoop.hdfs.TestBlockReaderLocal
Tests run: 11, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 10.33 sec
Running org.apache.hadoop.hdfs.TestHftpURLTimeouts
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.919 sec
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 45.012 sec
Running org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.132 sec
Running org.apache.hadoop.fs.TestGlobPaths
Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.317 sec
Running org.apache.hadoop.fs.TestResolveHdfsSymlink
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.296 sec
Running org.apache.hadoop.fs.TestFcHdfsSetUMask
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.473 sec
Running org.apache.hadoop.fs.TestFcHdfsPermission
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.416 sec
Running org.apache.hadoop.fs.TestUrlStreamHandler
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.511 sec
Running org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.924 sec
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.928 sec
Running org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.141 sec
Running org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.422 sec
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.477 sec
Running org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.016 sec
Running org.apache.hadoop.fs.permission.TestStickyBit
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.575 sec
Running org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.564 sec
Running org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Tests run: 69, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.334 sec
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.299 sec
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.442 sec
Running org.apache.hadoop.fs.TestVolumeId
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.067 sec
Running org.apache.hadoop.fs.TestSymlinkHdfsFileSystem
Tests run: 72, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 7.719 sec

Results :

Tests in error: 
  testInitializeReplQueuesEarly(org.apache.hadoop.hdfs.TestSafeMode): Timed out waiting for condition. Thread diagnostics:(..)

Tests run: 2013, Failures: 0, Errors: 1, Skipped: 38
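
The single error above, "Timed out waiting for condition", is the message a test emits when it gives up polling for a condition before its deadline; judging by its name, testInitializeReplQueuesEarly waits for the namenode's replication queues to initialize. A generic, hedged sketch of that polling pattern follows; the helper and the condition here are stand-ins, not the utilities the Hadoop test actually uses:

    import java.util.concurrent.TimeoutException;

    // Illustrative sketch only: a "wait for condition with timeout" loop of the kind
    // whose expiry yields a "Timed out waiting for condition." error.
    public class WaitForConditionSketch {

      // Placeholder condition; the real test polls actual namenode state.
      private static boolean conditionHolds() {
        return false;
      }

      static void waitFor(long checkEveryMillis, long timeoutMillis)
          throws TimeoutException, InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (!conditionHolds()) {
          if (System.currentTimeMillis() > deadline) {
            throw new TimeoutException("Timed out waiting for condition.");
          }
          Thread.sleep(checkEveryMillis);
        }
      }

      public static void main(String[] args) throws Exception {
        // Times out after roughly 30 seconds because the placeholder never becomes true.
        waitFor(500, 30000);
      }
    }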

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:30:36.402s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.769s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:30:39.036s
[INFO] Finished at: Mon Jul 22 13:04:52 UTC 2013
[INFO] Final Memory: 31M/401M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12.3:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Error updating JIRA issues. Saving issues for next build.
com.atlassian.jira.rpc.exception.RemoteAuthenticationException: Attempt to log in user 'hudson' failed. The maximum number of failed login attempts has been reached. Please log into the application through the web interface to reset the number of failed login attempts.

Hadoop-Hdfs-trunk - Build # 1468 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1468/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 11297 lines...]
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:30:36.402s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.769s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:30:39.036s
[INFO] Finished at: Mon Jul 22 13:04:52 UTC 2013
[INFO] Final Memory: 31M/401M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12.3:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Error updating JIRA issues. Saving issues for next build.
com.atlassian.jira.rpc.exception.RemoteAuthenticationException: Attempt to log in user 'hudson' failed. The maximum number of failed login attempts has been reached. Please log into the application through the web interface to reset the number of failed login attempts.
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Build failed in Jenkins: Hadoop-Hdfs-trunk #1467

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1467/changes>

Changes:

[kihwal] HDFS-5010. Reduce the frequency of getCurrentUser() calls from namenode. Contributed by Kihwal Lee.

[acmurthy] YARN-897. Ensure child queues are ordered correctly to account for completed containers. Contributed by Djellel Eddine Difallah.

------------------------------------------
[...truncated 15039 lines...]
java.lang.AssertionError: SBN should have still been checkpointing.
	at org.junit.Assert.fail(Assert.java:93)
	at org.junit.Assert.assertTrue(Assert.java:43)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestStandbyCheckpoints.testStandbyExceptionThrownDuringCheckpoint(TestStandbyCheckpoints.java:279)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:45)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:42)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:62)

Running org.apache.hadoop.contrib.bkjournal.TestCurrentInprogress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.693 sec
Running org.apache.hadoop.contrib.bkjournal.TestBookKeeperConfiguration
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.379 sec
Running org.apache.hadoop.contrib.bkjournal.TestBookKeeperJournalManager
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.5 sec

Results :

Failed tests:   testStandbyExceptionThrownDuringCheckpoint(org.apache.hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints): SBN should have still been checkpointing.

Tests run: 32, Failures: 1, Errors: 0, Skipped: 0

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS-NFS 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-nfs ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/test-dir>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/test/data>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ hadoop-hdfs-nfs ---
[INFO] Using default encoding to copy filtered resources.
[INFO] 
[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ hadoop-hdfs-nfs ---
[INFO] Compiling 12 source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/classes>
[INFO] 
[INFO] --- maven-resources-plugin:2.2:testResources (default-testResources) @ hadoop-hdfs-nfs ---
[INFO] Using default encoding to copy filtered resources.
[INFO] 
[INFO] --- maven-compiler-plugin:2.5.1:testCompile (default-testCompile) @ hadoop-hdfs-nfs ---
[INFO] Compiling 7 source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/test-classes>
[INFO] 
[INFO] --- maven-surefire-plugin:2.12.3:test (default-test) @ hadoop-hdfs-nfs ---
[INFO] Surefire report directory: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/surefire-reports>

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

Running org.apache.hadoop.hdfs.nfs.nfs3.TestOffsetRange
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec
Running org.apache.hadoop.hdfs.nfs.nfs3.TestRpcProgramNfs3
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec
Running org.apache.hadoop.hdfs.nfs.nfs3.TestDFSClientCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.322 sec
Running org.apache.hadoop.hdfs.nfs.TestMountd
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.394 sec

Results :

Tests run: 8, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (prepare-jar) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT.jar>
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:test-jar (prepare-test-jar) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-tests.jar>
[INFO] 
[INFO] >>> maven-source-plugin:2.1.2:jar (default) @ hadoop-hdfs-nfs >>>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] <<< maven-source-plugin:2.1.2:jar (default) @ hadoop-hdfs-nfs <<<
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar (default) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-sources.jar>
[INFO] 
[INFO] >>> maven-source-plugin:2.1.2:test-jar (default) @ hadoop-hdfs-nfs >>>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] <<< maven-source-plugin:2.1.2:test-jar (default) @ hadoop-hdfs-nfs <<<
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar (default) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-test-sources.jar>
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default) @ hadoop-hdfs-nfs ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is true
[INFO] ****** FindBugsMojo executeFindbugs *******
[INFO] Temp File is <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/findbugsTemp.xml>
[INFO] Fork Value is true
[INFO] xmlOutput is false
[INFO] 
[INFO] --- maven-dependency-plugin:2.1:copy (site) @ hadoop-hdfs-nfs ---
[INFO] Configured Artifact: jdiff:jdiff:1.0.9:jar
[INFO] Configured Artifact: org.apache.hadoop:hadoop-annotations:3.0.0-SNAPSHOT:jar
[INFO] Copying jdiff-1.0.9.jar to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/jdiff.jar>
[INFO] Copying hadoop-annotations-3.0.0-SNAPSHOT.jar to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-annotations.jar>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (site) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (pre-dist) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] >>> maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs-nfs >>>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] <<< maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs-nfs <<<
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs-nfs ---
[INFO] 
ExcludePrivateAnnotationsStandardDoclet
[INFO] 
[INFO] --- maven-assembly-plugin:2.3:single (dist) @ hadoop-hdfs-nfs ---
[WARNING] The following patterns were never triggered in this artifact exclusion filter:
o  'org.apache.ant:*:jar'
o  'jdiff:jdiff:jar'

[INFO] Copying files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT>
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-nfs ---
[WARNING] Artifact org.apache.hadoop:hadoop-hdfs-nfs:java-source:sources:3.0.0-SNAPSHOT already attached to project, ignoring duplicate
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-nfs ---
[WARNING] Artifact org.apache.hadoop:hadoop-hdfs-nfs:java-source:test-sources:3.0.0-SNAPSHOT already attached to project, ignoring duplicate
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (tar) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-nfs ---
[INFO] 
ExcludePrivateAnnotationsStandardDoclet
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-javadoc.jar>
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-nfs ---
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:39:39.912s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [2:27.133s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. FAILURE [54.269s]
[INFO] Apache Hadoop HDFS-NFS ............................ FAILURE [26.370s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.050s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:43:28.592s
[INFO] Finished at: Sun Jul 21 13:17:38 UTC 2013
[INFO] Final Memory: 47M/779M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12.3:test (default-test) on project hadoop-hdfs-bkjournal: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib/bkjournal/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:2.6:checkstyle (default-cli) on project hadoop-hdfs-nfs: An error has occurred in Checkstyle report generation. Failed during checkstyle execution: Unable to find configuration file at location file://<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/dev-support/checkstyle.xml>: Could not find resource 'file://<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/dev-support/checkstyle.xml>'. -> [Help 2]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs-bkjournal
Build step 'Execute shell' marked build as failure
Archiving artifacts
Error updating JIRA issues. Saving issues for next build.
com.atlassian.jira.rpc.exception.RemoteAuthenticationException: Attempt to log in user 'hudson' failed. The maximum number of failed login attempts has been reached. Please log into the application through the web interface to reset the number of failed login attempts.

Hadoop-Hdfs-trunk - Build # 1467 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1467/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 15232 lines...]
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:39:39.912s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [2:27.133s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. FAILURE [54.269s]
[INFO] Apache Hadoop HDFS-NFS ............................ FAILURE [26.370s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.050s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:43:28.592s
[INFO] Finished at: Sun Jul 21 13:17:38 UTC 2013
[INFO] Final Memory: 47M/779M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12.3:test (default-test) on project hadoop-hdfs-bkjournal: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib/bkjournal/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:2.6:checkstyle (default-cli) on project hadoop-hdfs-nfs: An error has occurred in Checkstyle report generation. Failed during checkstyle execution: Unable to find configuration file at location file:///home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/dev-support/checkstyle.xml: Could not find resource 'file:///home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/dev-support/checkstyle.xml'. -> [Help 2]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs-bkjournal
Build step 'Execute shell' marked build as failure
Archiving artifacts
Error updating JIRA issues. Saving issues for next build.
com.atlassian.jira.rpc.exception.RemoteAuthenticationException: Attempt to log in user 'hudson' failed. The maximum number of failed login attempts has been reached. Please log into the application through the web interface to reset the number of failed login attempts.
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Build failed in Jenkins: Hadoop-Hdfs-trunk #1466

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1466/changes>

Changes:

[szetszwo] HADOOP-9751. Add clientId and retryCount to RpcResponseHeaderProto.

[hitesh] YARN-919. Document setting default heap sizes in yarn-env.sh. Contributed by Mayank Bansal.

[tucu] HADOOP-9643. org.apache.hadoop.security.SecurityUtil calls toUpperCase(Locale.getDefault()) as well as toLowerCase(Locale.getDefault()) on hadoop.security.authentication value. (markrmiller@gmail.com via tucu)

[daryn] HADOOP-9748. Reduce blocking on UGI.ensureInitialized (daryn)

------------------------------------------
[...truncated 15035 lines...]
java.lang.AssertionError: SBN should have still been checkpointing.
	at org.junit.Assert.fail(Assert.java:93)
	at org.junit.Assert.assertTrue(Assert.java:43)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestStandbyCheckpoints.testStandbyExceptionThrownDuringCheckpoint(TestStandbyCheckpoints.java:279)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:45)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:42)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:62)

Running org.apache.hadoop.contrib.bkjournal.TestCurrentInprogress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.694 sec
Running org.apache.hadoop.contrib.bkjournal.TestBookKeeperConfiguration
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.101 sec
Running org.apache.hadoop.contrib.bkjournal.TestBookKeeperJournalManager
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.463 sec

Results :

Failed tests:   testStandbyExceptionThrownDuringCheckpoint(org.apache.hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints): SBN should have still been checkpointing.

Tests run: 32, Failures: 1, Errors: 0, Skipped: 0

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS-NFS 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-nfs ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/test-dir>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/test/data>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ hadoop-hdfs-nfs ---
[INFO] Using default encoding to copy filtered resources.
[INFO] 
[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ hadoop-hdfs-nfs ---
[INFO] Compiling 12 source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/classes>
[INFO] 
[INFO] --- maven-resources-plugin:2.2:testResources (default-testResources) @ hadoop-hdfs-nfs ---
[INFO] Using default encoding to copy filtered resources.
[INFO] 
[INFO] --- maven-compiler-plugin:2.5.1:testCompile (default-testCompile) @ hadoop-hdfs-nfs ---
[INFO] Compiling 7 source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/test-classes>
[INFO] 
[INFO] --- maven-surefire-plugin:2.12.3:test (default-test) @ hadoop-hdfs-nfs ---
[INFO] Surefire report directory: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/surefire-reports>

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.hadoop.hdfs.nfs.nfs3.TestOffsetRange
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec
Running org.apache.hadoop.hdfs.nfs.nfs3.TestRpcProgramNfs3
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec
Running org.apache.hadoop.hdfs.nfs.nfs3.TestDFSClientCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.32 sec
Running org.apache.hadoop.hdfs.nfs.TestMountd
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.34 sec

Results :

Tests run: 8, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (prepare-jar) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT.jar>
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:test-jar (prepare-test-jar) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-tests.jar>
[INFO] 
[INFO] >>> maven-source-plugin:2.1.2:jar (default) @ hadoop-hdfs-nfs >>>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] <<< maven-source-plugin:2.1.2:jar (default) @ hadoop-hdfs-nfs <<<
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar (default) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-sources.jar>
[INFO] 
[INFO] >>> maven-source-plugin:2.1.2:test-jar (default) @ hadoop-hdfs-nfs >>>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] <<< maven-source-plugin:2.1.2:test-jar (default) @ hadoop-hdfs-nfs <<<
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar (default) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-test-sources.jar>
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default) @ hadoop-hdfs-nfs ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is true
[INFO] ****** FindBugsMojo executeFindbugs *******
[INFO] Temp File is <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/findbugsTemp.xml>
[INFO] Fork Value is true
[INFO] xmlOutput is false
[INFO] 
[INFO] --- maven-dependency-plugin:2.1:copy (site) @ hadoop-hdfs-nfs ---
[INFO] Configured Artifact: jdiff:jdiff:1.0.9:jar
[INFO] Configured Artifact: org.apache.hadoop:hadoop-annotations:3.0.0-SNAPSHOT:jar
[INFO] Copying jdiff-1.0.9.jar to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/jdiff.jar>
[INFO] Copying hadoop-annotations-3.0.0-SNAPSHOT.jar to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-annotations.jar>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (site) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (pre-dist) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] >>> maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs-nfs >>>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] <<< maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs-nfs <<<
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs-nfs ---
[INFO] 
ExcludePrivateAnnotationsStandardDoclet
[INFO] 
[INFO] --- maven-assembly-plugin:2.3:single (dist) @ hadoop-hdfs-nfs ---
[WARNING] The following patterns were never triggered in this artifact exclusion filter:
o  'org.apache.ant:*:jar'
o  'jdiff:jdiff:jar'

[INFO] Copying files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT>
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-nfs ---
[WARNING] Artifact org.apache.hadoop:hadoop-hdfs-nfs:java-source:sources:3.0.0-SNAPSHOT already attached to project, ignoring duplicate
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-nfs ---
[WARNING] Artifact org.apache.hadoop:hadoop-hdfs-nfs:java-source:test-sources:3.0.0-SNAPSHOT already attached to project, ignoring duplicate
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (tar) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-nfs ---
[INFO] 
ExcludePrivateAnnotationsStandardDoclet
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-javadoc.jar>
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-nfs ---
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:38:27.292s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [2:21.618s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. FAILURE [58.321s]
[INFO] Apache Hadoop HDFS-NFS ............................ FAILURE [25.697s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.033s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:42:13.807s
[INFO] Finished at: Sat Jul 20 13:16:33 UTC 2013
[INFO] Final Memory: 57M/883M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12.3:test (default-test) on project hadoop-hdfs-bkjournal: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib/bkjournal/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:2.6:checkstyle (default-cli) on project hadoop-hdfs-nfs: An error has occurred in Checkstyle report generation. Failed during checkstyle execution: Unable to find configuration file at location file://<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/dev-support/checkstyle.xml>: Could not find resource 'file://<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/dev-support/checkstyle.xml>'. -> [Help 2]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs-bkjournal
Build step 'Execute shell' marked build as failure
Archiving artifacts
Error updating JIRA issues. Saving issues for next build.
com.atlassian.jira.rpc.exception.RemoteAuthenticationException: Attempt to log in user 'hudson' failed. The maximum number of failed login attempts has been reached. Please log into the application through the web interface to reset the number of failed login attempts.
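
The surefire failure above comes from a single test class, TestBookKeeperHACheckpoints, which appears (judging by the stack trace) to inherit testStandbyExceptionThrownDuringCheckpoint from TestStandbyCheckpoints. To iterate on it without re-running the whole reactor, surefire's test filter can be used from the bkjournal contrib module; this is a sketch assuming the parent 3.0.0-SNAPSHOT artifacts are already installed locally:

    # run only the failing test class in the bkjournal contrib module
    cd hadoop-hdfs-project/hadoop-hdfs/src/contrib/bkjournal
    mvn test -Dtest=TestBookKeeperHACheckpoints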

Hadoop-Hdfs-trunk - Build # 1466 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1466/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 15228 lines...]
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:38:27.292s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [2:21.618s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. FAILURE [58.321s]
[INFO] Apache Hadoop HDFS-NFS ............................ FAILURE [25.697s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.033s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:42:13.807s
[INFO] Finished at: Sat Jul 20 13:16:33 UTC 2013
[INFO] Final Memory: 57M/883M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12.3:test (default-test) on project hadoop-hdfs-bkjournal: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib/bkjournal/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:2.6:checkstyle (default-cli) on project hadoop-hdfs-nfs: An error has occurred in Checkstyle report generation. Failed during checkstyle execution: Unable to find configuration file at location file:///home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/dev-support/checkstyle.xml: Could not find resource 'file:///home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/dev-support/checkstyle.xml'. -> [Help 2]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs-bkjournal
Build step 'Execute shell' marked build as failure
Archiving artifacts
Error updating JIRA issues. Saving issues for next build.
com.atlassian.jira.rpc.exception.RemoteAuthenticationException: Attempt to log in user 'hudson' failed. The maximum number of failed login attempts has been reached. Please log into the application through the web interface to reset the number of failed login attempts.
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.
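
The second error in the summaries above is independent of the test failure: the checkstyle goal on hadoop-hdfs-nfs points at a dev-support/checkstyle.xml that does not exist in that module. Until the missing file (or the plugin configuration) is fixed, one possible workaround is to skip report generation for the run; checkstyle.skip is a maven-checkstyle-plugin property, though it is worth verifying against the 2.6 version used here, and the goals are again assumed:

    # assumed workaround: skip the checkstyle report while the config file is missing
    mvn clean install -Dcheckstyle.skip=true -rf :hadoop-hdfs-nfs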

Build failed in Jenkins: Hadoop-Hdfs-trunk #1465

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1465/changes>

Changes:

[harsh] HDFS-4278. Log an ERROR when DFS_BLOCK_ACCESS_TOKEN_ENABLE config is disabled but security is turned on. Contributed by Kousuke Saruta. (harsh)

[jing9] HDFS-5007. Replace hard-coded property keys with DFSConfigKeys fields. Contributed by Kousuke Saruta.

[acmurthy] Fixed CHANGES.txt to reflect that new content in branch-2 goes to hadoop-2.3.0.

[acmurthy] Fixed CHANGES.txt to reflect that all fixes in 2.1.1-beta are now in 2.1.0-beta.

[acmurthy] YARN-918. Remove ApplicationAttemptId from RegisterApplicationMasterRequestProto. Contributed by Vinod K V.

[vinodkv] YARN-814. Improving diagnostics when containers fail during launch due to various reasons like invalid env etc. Contributed by Jian He.

[jing9] HADOOP-9717. Add retry attempt count to the RPC requests. Contributed by Jing Zhao.

[llu] HADOOP-9164. Print paths of loaded native libraries in NativeLibraryChecker. (Binglin Chang via llu)

[cnauroth] HDFS-4996. ClientProtocol#metaSave can be made idempotent by overwriting the output file instead of appending to it. Contributed by Chris Nauroth.

[jlowe] MAPREDUCE-5265. History server admin service to refresh user and superuser group mappings. Contributed by Ashwin Shankar

[acmurthy] YARN-701. Use application tokens irrespective of secure or non-secure mode. Contributed by Vinod K V.

------------------------------------------
[...truncated 15046 lines...]
java.lang.AssertionError: SBN should have still been checkpointing.
	at org.junit.Assert.fail(Assert.java:93)
	at org.junit.Assert.assertTrue(Assert.java:43)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestStandbyCheckpoints.testStandbyExceptionThrownDuringCheckpoint(TestStandbyCheckpoints.java:279)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:45)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:42)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:62)

Running org.apache.hadoop.contrib.bkjournal.TestCurrentInprogress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.693 sec
Running org.apache.hadoop.contrib.bkjournal.TestBookKeeperConfiguration
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.104 sec
Running org.apache.hadoop.contrib.bkjournal.TestBookKeeperJournalManager
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.638 sec

Results :

Failed tests:   testStandbyExceptionThrownDuringCheckpoint(org.apache.hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints): SBN should have still been checkpointing.

Tests run: 32, Failures: 1, Errors: 0, Skipped: 0

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS-NFS 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-nfs ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/test-dir>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/test/data>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ hadoop-hdfs-nfs ---
[INFO] Using default encoding to copy filtered resources.
[INFO] 
[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ hadoop-hdfs-nfs ---
[INFO] Compiling 12 source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/classes>
[INFO] 
[INFO] --- maven-resources-plugin:2.2:testResources (default-testResources) @ hadoop-hdfs-nfs ---
[INFO] Using default encoding to copy filtered resources.
[INFO] 
[INFO] --- maven-compiler-plugin:2.5.1:testCompile (default-testCompile) @ hadoop-hdfs-nfs ---
[INFO] Compiling 7 source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/test-classes>
[INFO] 
[INFO] --- maven-surefire-plugin:2.12.3:test (default-test) @ hadoop-hdfs-nfs ---
[INFO] Surefire report directory: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/surefire-reports>

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.hadoop.hdfs.nfs.nfs3.TestOffsetRange
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.057 sec
Running org.apache.hadoop.hdfs.nfs.nfs3.TestRpcProgramNfs3
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.057 sec
Running org.apache.hadoop.hdfs.nfs.nfs3.TestDFSClientCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.321 sec
Running org.apache.hadoop.hdfs.nfs.TestMountd
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.384 sec

Results :

Tests run: 8, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (prepare-jar) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT.jar>
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:test-jar (prepare-test-jar) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-tests.jar>
[INFO] 
[INFO] >>> maven-source-plugin:2.1.2:jar (default) @ hadoop-hdfs-nfs >>>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] <<< maven-source-plugin:2.1.2:jar (default) @ hadoop-hdfs-nfs <<<
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar (default) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-sources.jar>
[INFO] 
[INFO] >>> maven-source-plugin:2.1.2:test-jar (default) @ hadoop-hdfs-nfs >>>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] <<< maven-source-plugin:2.1.2:test-jar (default) @ hadoop-hdfs-nfs <<<
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar (default) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-test-sources.jar>
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default) @ hadoop-hdfs-nfs ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is true
[INFO] ****** FindBugsMojo executeFindbugs *******
[INFO] Temp File is <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/findbugsTemp.xml>
[INFO] Fork Value is true
[INFO] xmlOutput is false
[INFO] 
[INFO] --- maven-dependency-plugin:2.1:copy (site) @ hadoop-hdfs-nfs ---
[INFO] Configured Artifact: jdiff:jdiff:1.0.9:jar
[INFO] Configured Artifact: org.apache.hadoop:hadoop-annotations:3.0.0-SNAPSHOT:jar
[INFO] Copying jdiff-1.0.9.jar to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/jdiff.jar>
[INFO] Copying hadoop-annotations-3.0.0-SNAPSHOT.jar to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-annotations.jar>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (site) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (pre-dist) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] >>> maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs-nfs >>>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] <<< maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs-nfs <<<
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs-nfs ---
[INFO] 
ExcludePrivateAnnotationsStandardDoclet
[INFO] 
[INFO] --- maven-assembly-plugin:2.3:single (dist) @ hadoop-hdfs-nfs ---
[WARNING] The following patterns were never triggered in this artifact exclusion filter:
o  'org.apache.ant:*:jar'
o  'jdiff:jdiff:jar'

[INFO] Copying files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT>
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-nfs ---
[WARNING] Artifact org.apache.hadoop:hadoop-hdfs-nfs:java-source:sources:3.0.0-SNAPSHOT already attached to project, ignoring duplicate
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-nfs ---
[WARNING] Artifact org.apache.hadoop:hadoop-hdfs-nfs:java-source:test-sources:3.0.0-SNAPSHOT already attached to project, ignoring duplicate
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (tar) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-nfs ---
[INFO] 
ExcludePrivateAnnotationsStandardDoclet
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-javadoc.jar>
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-nfs ---
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:38:14.180s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [2:35.984s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. FAILURE [59.782s]
[INFO] Apache Hadoop HDFS-NFS ............................ FAILURE [25.917s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.033s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:42:16.756s
[INFO] Finished at: Fri Jul 19 13:16:30 UTC 2013
[INFO] Final Memory: 47M/804M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12.3:test (default-test) on project hadoop-hdfs-bkjournal: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib/bkjournal/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:2.6:checkstyle (default-cli) on project hadoop-hdfs-nfs: An error has occurred in Checkstyle report generation. Failed during checkstyle execution: Unable to find configuration file at location file://<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/dev-support/checkstyle.xml>: Could not find resource 'file://<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/dev-support/checkstyle.xml>'. -> [Help 2]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs-bkjournal
Build step 'Execute shell' marked build as failure
Archiving artifacts
Error updating JIRA issues. Saving issues for next build.
com.atlassian.jira.rpc.exception.RemoteAuthenticationException: Attempt to log in user 'hudson' failed. The maximum number of failed login attempts has been reached. Please log into the application through the web interface to reset the number of failed login attempts.