Posted to hdfs-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2016/05/22 13:00:34 UTC

Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #1248

See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1248/changes>

Changes:

[iwasakims] HDFS-10439. Update setOwner doc in HdfsPermissionsGuide. Contributed by

------------------------------------------
[...truncated 5240 lines...]
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.723 sec - in org.apache.hadoop.hdfs.protocolPB.TestPBHelper
Running org.apache.hadoop.hdfs.TestReconstructStripedFile
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStream
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 54.573 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure050
Running org.apache.hadoop.hdfs.TestDFSInputStream
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.106 sec - in org.apache.hadoop.hdfs.TestDFSInputStream
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.958 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Running org.apache.hadoop.hdfs.TestFileAppend4
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 72.85 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStream
Running org.apache.hadoop.hdfs.TestParallelRead
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 108.997 sec - in org.apache.hadoop.hdfs.TestReconstructStripedFile
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 40.697 sec - in org.apache.hadoop.hdfs.TestFileAppend4
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 35.887 sec - in org.apache.hadoop.hdfs.TestParallelRead
Running org.apache.hadoop.hdfs.TestClose
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure040
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 153.554 sec - in org.apache.hadoop.hdfs.TestWriteReadStripedFile
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure170
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.424 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure170
Running org.apache.hadoop.hdfs.TestParallelShortCircuitLegacyRead
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.388 sec - in org.apache.hadoop.hdfs.TestClose
Running org.apache.hadoop.hdfs.TestLargeBlock
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.461 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.606 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Running org.apache.hadoop.hdfs.TestClientReportBadBlock
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.175 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitLegacyRead
Running org.apache.hadoop.hdfs.TestWriteRead
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.717 sec - in org.apache.hadoop.hdfs.TestLargeBlock
Running org.apache.hadoop.hdfs.TestClientProtocolForPipelineRecovery
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.104 sec - in org.apache.hadoop.hdfs.TestClientReportBadBlock
Running org.apache.hadoop.hdfs.TestBalancerBandwidth
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.323 sec - in org.apache.hadoop.hdfs.TestBalancerBandwidth
Running org.apache.hadoop.hdfs.TestDFSUpgrade
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 54.252 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure040
Running org.apache.hadoop.hdfs.TestBlockStoragePolicy
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 33.998 sec - in org.apache.hadoop.hdfs.TestWriteRead
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.52 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.094 sec - in org.apache.hadoop.hdfs.TestDFSUpgrade
Running org.apache.hadoop.hdfs.TestLeaseRecoveryStriped
Running org.apache.hadoop.hdfs.client.impl.TestBlockReaderLocal
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 66.429 sec - in org.apache.hadoop.hdfs.TestClientProtocolForPipelineRecovery
Running org.apache.hadoop.hdfs.client.impl.TestBlockReaderLocalLegacy
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.192 sec - in org.apache.hadoop.hdfs.client.impl.TestBlockReaderLocalLegacy
Running org.apache.hadoop.hdfs.client.impl.TestBlockReaderRemote2
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.941 sec - in org.apache.hadoop.hdfs.TestBlockStoragePolicy
Running org.apache.hadoop.hdfs.client.impl.TestClientBlockVerification
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 38.417 sec - in org.apache.hadoop.hdfs.TestLeaseRecoveryStriped
Running org.apache.hadoop.hdfs.client.impl.TestBlockReaderRemote
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.658 sec - in org.apache.hadoop.hdfs.client.impl.TestBlockReaderRemote2
Running org.apache.hadoop.hdfs.client.impl.TestBlockReaderFactory
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.288 sec - in org.apache.hadoop.hdfs.client.impl.TestClientBlockVerification
Running org.apache.hadoop.hdfs.qjournal.TestSecureNNWithQJM
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.67 sec - in org.apache.hadoop.hdfs.client.impl.TestBlockReaderRemote
Running org.apache.hadoop.hdfs.qjournal.TestNNWithQJM
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.171 sec - in org.apache.hadoop.hdfs.qjournal.TestNNWithQJM
Running org.apache.hadoop.hdfs.qjournal.server.TestJournalNodeMXBean
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.367 sec - in org.apache.hadoop.hdfs.client.impl.TestBlockReaderFactory
Running org.apache.hadoop.hdfs.qjournal.server.TestJournal
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.521 sec - in org.apache.hadoop.hdfs.qjournal.server.TestJournalNodeMXBean
Running org.apache.hadoop.hdfs.qjournal.server.TestJournalNode
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.886 sec - in org.apache.hadoop.hdfs.qjournal.server.TestJournal
Running org.apache.hadoop.hdfs.qjournal.client.TestIPCLoggerChannel
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.929 sec - in org.apache.hadoop.hdfs.qjournal.client.TestIPCLoggerChannel
Running org.apache.hadoop.hdfs.qjournal.client.TestQJMWithFaults
Tests run: 37, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 68.357 sec - in org.apache.hadoop.hdfs.client.impl.TestBlockReaderLocal
Running org.apache.hadoop.hdfs.qjournal.client.TestEpochsAreUnique
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.189 sec - in org.apache.hadoop.hdfs.qjournal.client.TestEpochsAreUnique
Running org.apache.hadoop.hdfs.qjournal.client.TestQuorumJournalManagerUnit
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.451 sec - in org.apache.hadoop.hdfs.qjournal.client.TestQuorumJournalManagerUnit
Running org.apache.hadoop.hdfs.qjournal.client.TestSegmentRecoveryComparator
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.23 sec - in org.apache.hadoop.hdfs.qjournal.client.TestSegmentRecoveryComparator
Running org.apache.hadoop.hdfs.qjournal.client.TestQuorumCall
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.241 sec - in org.apache.hadoop.hdfs.qjournal.client.TestQuorumCall
Running org.apache.hadoop.hdfs.qjournal.client.TestQuorumJournalManager
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.309 sec - in org.apache.hadoop.hdfs.qjournal.server.TestJournalNode
Running org.apache.hadoop.hdfs.qjournal.TestMiniJournalCluster
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.892 sec - in org.apache.hadoop.hdfs.qjournal.TestMiniJournalCluster
Running org.apache.hadoop.hdfs.TestAsyncDFSRename
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 57.618 sec - in org.apache.hadoop.hdfs.qjournal.TestSecureNNWithQJM
Running org.apache.hadoop.hdfs.TestFileCreationEmpty
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.273 sec - in org.apache.hadoop.hdfs.TestFileCreationEmpty
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure200
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 52.326 sec - in org.apache.hadoop.hdfs.qjournal.client.TestQuorumJournalManager
Running org.apache.hadoop.hdfs.TestBlockMissingException
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.785 sec - in org.apache.hadoop.hdfs.TestBlockMissingException
Running org.apache.hadoop.hdfs.TestDataTransferKeepalive
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.79 sec - in org.apache.hadoop.hdfs.TestDataTransferKeepalive
Running org.apache.hadoop.hdfs.TestModTime
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 59.136 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure200
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure210
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.143 sec - in org.apache.hadoop.hdfs.TestModTime
Running org.apache.hadoop.hdfs.TestRollingUpgradeDowngrade
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.875 sec - in org.apache.hadoop.hdfs.TestRollingUpgradeDowngrade
Running org.apache.hadoop.hdfs.TestLease
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.463 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.hdfs.TestParallelUnixDomainRead
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 37.247 sec - in org.apache.hadoop.hdfs.TestParallelUnixDomainRead
Running org.apache.hadoop.hdfs.TestDeprecatedKeys
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.49 sec - in org.apache.hadoop.hdfs.TestDeprecatedKeys
Running org.apache.hadoop.hdfs.TestFSOutputSummer
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.083 sec - in org.apache.hadoop.hdfs.TestFSOutputSummer
Running org.apache.hadoop.hdfs.TestRead
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.119 sec - in org.apache.hadoop.hdfs.TestRead
Running org.apache.hadoop.hdfs.TestFileStatusWithECPolicy
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.882 sec - in org.apache.hadoop.hdfs.TestFileStatusWithECPolicy
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure150
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 109.001 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure210
Running org.apache.hadoop.hdfs.TestHDFSPolicyProvider
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.203 sec - in org.apache.hadoop.hdfs.TestHDFSPolicyProvider
Running org.apache.hadoop.hdfs.TestAppendSnapshotTruncate
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 212.713 sec - in org.apache.hadoop.hdfs.TestAsyncDFSRename
Running org.apache.hadoop.hdfs.TestDFSInotifyEventInputStream
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 29.732 sec - in org.apache.hadoop.hdfs.TestAppendSnapshotTruncate
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure110
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.351 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure110
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.228 sec - in org.apache.hadoop.hdfs.TestDFSInotifyEventInputStream
Running org.apache.hadoop.hdfs.TestGetBlocks
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 55.753 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure150
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure020
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.965 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Running org.apache.hadoop.hdfs.TestDecommissionWithStriped
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.384 sec - in org.apache.hadoop.hdfs.TestGetBlocks
Running org.apache.hadoop.hdfs.TestDataTransferProtocol
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.116 sec - in org.apache.hadoop.hdfs.TestDataTransferProtocol
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 49.186 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure020
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.098 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestKeyProviderCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.49 sec - in org.apache.hadoop.hdfs.TestKeyProviderCache
Running org.apache.hadoop.net.TestNetworkTopology
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.767 sec - in org.apache.hadoop.net.TestNetworkTopology
Running org.apache.hadoop.tracing.TestTraceAdmin
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 66.375 sec - in org.apache.hadoop.hdfs.TestDecommissionWithStriped
Running org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.624 sec - in org.apache.hadoop.tracing.TestTraceAdmin
Running org.apache.hadoop.tracing.TestTracing
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.549 sec - in org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.852 sec - in org.apache.hadoop.tracing.TestTracing
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 352.986 sec - in org.apache.hadoop.hdfs.qjournal.client.TestQJMWithFaults
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Running org.apache.hadoop.TestRefreshCallQueue
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.72 sec - in org.apache.hadoop.TestRefreshCallQueue
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 54.453 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 58.91 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 124.315 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure

Results :

Failed tests: 
  TestShortCircuitCache.testDataXceiverCleansUpSlotsOnFailure:682->checkNumberOfSegmentsAndSlots:628 expected:<1> but was:<2>

Tests run: 4427, Failures: 1, Errors: 0, Skipped: 17
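
To chase the single failure above outside Jenkins, Surefire can run just that class and method; the module path below is the standard trunk layout and is given only as an illustration.

    cd hadoop-hdfs-project/hadoop-hdfs
    # -Dtest=Class#method is a standard Surefire option; runs only the failing method
    mvn test -Dtest=TestShortCircuitCache#testDataXceiverCleansUpSlotsOnFailure

Whether the expected:<1> but was:<2> mismatch reproduces off the shared build host is not established by this log.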

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [04:03 min]
[INFO] Apache Hadoop HDFS ................................. FAILURE [  01:10 h]
[INFO] Apache Hadoop HDFS Native Client ................... SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.098 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:14 h
[INFO] Finished at: 2016-05-22T13:00:10+00:00
[INFO] Final Memory: 96M/3727M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
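
The <goals> placeholder above is Maven's own; a concrete resume would substitute whatever goals the job originally ran, for example (illustrative only, with -e added for the fuller stack trace the message suggests):

    mvn -e test -rf :hadoop-hdfs
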
Build step 'Execute shell' marked build as failure
Archiving artifacts
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Recording test results
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3

---------------------------------------------------------------------
To unsubscribe, e-mail: hdfs-dev-unsubscribe@hadoop.apache.org
For additional commands, e-mail: hdfs-dev-help@hadoop.apache.org


Hadoop-Hdfs-trunk-Java8 - Build # 1249 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1249/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 5444 lines...]
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [03:57 min]
[INFO] Apache Hadoop HDFS ................................. FAILURE [57:08 min]
[INFO] Apache Hadoop HDFS Native Client ................... SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.077 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:01 h
[INFO] Finished at: 2016-05-22T23:46:45+00:00
[INFO] Final Memory: 98M/3679M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Recording test results
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.testUpgradeFromRel1BBWImage

Error Message:
Cannot obtain block length for LocatedBlock{BP-115903912-67.195.81.145-1463960174070:blk_7162739548153522810_1020; getBlockSize()=1024; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[127.0.0.1:36608,DS-f4180b96-3a52-441c-b654-4a8a3c10fcc5,DISK]]}

Stack Trace:
java.io.IOException: Cannot obtain block length for LocatedBlock{BP-115903912-67.195.81.145-1463960174070:blk_7162739548153522810_1020; getBlockSize()=1024; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[127.0.0.1:36608,DS-f4180b96-3a52-441c-b654-4a8a3c10fcc5,DISK]]}
	at org.apache.hadoop.hdfs.DFSInputStream.readBlockLength(DFSInputStream.java:435)
	at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:345)
	at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:278)
	at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:267)
	at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1038)
	at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1003)
	at org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.dfsOpenFileWithRetries(TestDFSUpgradeFromImage.java:178)
	at org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.verifyDir(TestDFSUpgradeFromImage.java:214)
	at org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.verifyFileSystem(TestDFSUpgradeFromImage.java:229)
	at org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.upgradeAndVerify(TestDFSUpgradeFromImage.java:606)
	at org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.testUpgradeFromRel1BBWImage(TestDFSUpgradeFromImage.java:628)
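
Surefire also writes a per-class plain-text report that usually carries more context than this digest; under the workspace referenced earlier it would follow the conventional naming shown below (path assumed from the standard layout, not verified on this host):

    less hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports/org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.txt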


FAILED:  org.apache.hadoop.hdfs.server.datanode.TestLargeBlockReport.testBlockReportSucceedsWithLargerLengthLimit

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
	at org.apache.hadoop.hdfs.server.datanode.TestLargeBlockReport.testBlockReportSucceedsWithLargerLengthLimit(TestLargeBlockReport.java:97)




Hadoop-Hdfs-trunk-Java8 - Build # 1250 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1250/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 5463 lines...]
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [03:58 min]
[INFO] Apache Hadoop HDFS ................................. FAILURE [57:26 min]
[INFO] Apache Hadoop HDFS Native Client ................... SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.104 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:01 h
[INFO] Finished at: 2016-05-23T00:50:24+00:00
[INFO] Final Memory: 96M/3607M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Recording test results
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.hdfs.server.datanode.TestDataNodeMultipleRegistrations.testDNWithInvalidStorageWithHA

Error Message:
Problem binding to [localhost:40602] java.net.BindException: Address already in use; For more details see:  http://wiki.apache.org/hadoop/BindException

Stack Trace:
java.net.BindException: Problem binding to [localhost:40602] java.net.BindException: Address already in use; For more details see:  http://wiki.apache.org/hadoop/BindException
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.apache.hadoop.ipc.Server.bind(Server.java:530)
	at org.apache.hadoop.ipc.Server$Listener.<init>(Server.java:793)
	at org.apache.hadoop.ipc.Server.<init>(Server.java:2592)
	at org.apache.hadoop.ipc.RPC$Server.<init>(RPC.java:958)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server.<init>(ProtobufRpcEngine.java:559)
	at org.apache.hadoop.ipc.ProtobufRpcEngine.getServer(ProtobufRpcEngine.java:534)
	at org.apache.hadoop.ipc.RPC$Builder.build(RPC.java:800)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.<init>(NameNodeRpcServer.java:438)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createRpcServer(NameNode.java:787)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:714)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:928)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:907)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1624)
	at org.apache.hadoop.hdfs.MiniDFSCluster.restartNameNode(MiniDFSCluster.java:2038)
	at org.apache.hadoop.hdfs.server.datanode.TestDataNodeMultipleRegistrations.testDNWithInvalidStorageWithHA(TestDataNodeMultipleRegistrations.java:293)
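
An "Address already in use" failure on a fixed localhost port usually points at a collision with another process or an earlier run on the shared executor rather than at the change under test; when reproducing locally, standard Linux utilities can show what currently holds the port (availability on the build slave not verified, port number taken from the message above):

    ss -ltnp | grep 40602
    # or
    lsof -iTCP:40602 -sTCP:LISTEN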


FAILED:  org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.testUpgradeFromRel1BBWImage

Error Message:
Cannot obtain block length for LocatedBlock{BP-75433594-67.195.81.148-1463961857919:blk_7162739548153522810_1020; getBlockSize()=1024; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[127.0.0.1:42478,DS-cfa89936-1995-4821-a503-5738bcdd581a,DISK]]}

Stack Trace:
java.io.IOException: Cannot obtain block length for LocatedBlock{BP-75433594-67.195.81.148-1463961857919:blk_7162739548153522810_1020; getBlockSize()=1024; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[127.0.0.1:42478,DS-cfa89936-1995-4821-a503-5738bcdd581a,DISK]]}
	at org.apache.hadoop.hdfs.DFSInputStream.readBlockLength(DFSInputStream.java:435)
	at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:345)
	at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:278)
	at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:267)
	at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1038)
	at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1003)
	at org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.dfsOpenFileWithRetries(TestDFSUpgradeFromImage.java:178)
	at org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.verifyDir(TestDFSUpgradeFromImage.java:214)
	at org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.verifyFileSystem(TestDFSUpgradeFromImage.java:229)
	at org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.upgradeAndVerify(TestDFSUpgradeFromImage.java:606)
	at org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.testUpgradeFromRel1BBWImage(TestDFSUpgradeFromImage.java:628)




Hadoop-Hdfs-trunk-Java8 - Build # 1252 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1252/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 5433 lines...]
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [04:06 min]
[INFO] Apache Hadoop HDFS ................................. FAILURE [  01:11 h]
[INFO] Apache Hadoop HDFS Native Client ................... SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.079 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:15 h
[INFO] Finished at: 2016-05-23T18:23:23+00:00
[INFO] Final Memory: 96M/3577M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Recording test results
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.hdfs.server.namenode.TestEditLog.testBatchedSyncWithClosedLogs[1]

Error Message:
logging edit without syncing should do not affect txid expected:<1> but was:<2>

Stack Trace:
java.lang.AssertionError: logging edit without syncing should do not affect txid expected:<1> but was:<2>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.hdfs.server.namenode.TestEditLog.testBatchedSyncWithClosedLogs(TestEditLog.java:594)




Jenkins build is back to normal : Hadoop-Hdfs-trunk-Java8 #1255

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1255/changes>



Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #1254

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1254/changes>

Changes:

[kasha] YARN-4979. FSAppAttempt demand calculation considers demands at multiple

------------------------------------------
[...truncated 5254 lines...]
ERROR: Could not install MAVEN_3_3_3_HOME
java.lang.NullPointerException
	at hudson.plugins.toolenv.ToolEnvBuildWrapper$1.buildEnvVars(ToolEnvBuildWrapper.java:46)
	at hudson.model.AbstractBuild.getEnvironment(AbstractBuild.java:947)
	at hudson.plugins.git.GitSCM.getParamExpandedRepos(GitSCM.java:390)
	at hudson.plugins.git.GitSCM.compareRemoteRevisionWithImpl(GitSCM.java:577)
	at hudson.plugins.git.GitSCM.compareRemoteRevisionWith(GitSCM.java:527)
	at hudson.scm.SCM.compareRemoteRevisionWith(SCM.java:381)
	at hudson.scm.SCM.poll(SCM.java:398)
	at hudson.model.AbstractProject._poll(AbstractProject.java:1453)
	at hudson.model.AbstractProject.poll(AbstractProject.java:1356)
	at hudson.triggers.SCMTrigger$Runner.runPolling(SCMTrigger.java:526)
	at hudson.triggers.SCMTrigger$Runner.run(SCMTrigger.java:555)
	at hudson.util.SequentialExecutionQueue$QueueEntry.run(SequentialExecutionQueue.java:119)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 52.89 sec - in org.apache.hadoop.hdfs.TestDistributedFileSystem
Running org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.363 sec - in org.apache.hadoop.hdfs.crypto.TestHdfsCryptoStreams
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure070
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.951 sec - in org.apache.hadoop.hdfs.TestErasureCodingPolicyWithSnapshot
Running org.apache.hadoop.hdfs.TestAclsEndToEnd
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.367 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 41.658 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure110
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure020
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.451 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure020
Running org.apache.hadoop.hdfs.TestFileStatus
Running org.apache.hadoop.hdfs.TestAsyncDFSRename
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.946 sec - in org.apache.hadoop.hdfs.TestFileStatus
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.909 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.695 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Running org.apache.hadoop.hdfs.TestPread
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 42.89 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure070
Running org.apache.hadoop.hdfs.TestFileAppend2
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.164 sec - in org.apache.hadoop.hdfs.TestFileAppend2
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 66.58 sec - in org.apache.hadoop.hdfs.TestAclsEndToEnd
Running org.apache.hadoop.hdfs.TestGetFileChecksum
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.357 sec - in org.apache.hadoop.hdfs.TestGetFileChecksum
Running org.apache.hadoop.hdfs.TestLeaseRecovery2
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.112 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure000
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.357 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure000
Running org.apache.hadoop.hdfs.TestWriteConfigurationToDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.284 sec - in org.apache.hadoop.hdfs.TestWriteConfigurationToDFS
Running org.apache.hadoop.hdfs.TestRollingUpgrade
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 101.888 sec - in org.apache.hadoop.hdfs.TestPread
Running org.apache.hadoop.hdfs.TestReservedRawPaths
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.475 sec - in org.apache.hadoop.hdfs.TestReservedRawPaths
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.802 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 75.076 sec - in org.apache.hadoop.hdfs.TestLeaseRecovery2
Running org.apache.hadoop.hdfs.tools.TestDFSHAAdminMiniCluster
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.828 sec - in org.apache.hadoop.hdfs.tools.TestDFSHAAdminMiniCluster
Running org.apache.hadoop.hdfs.tools.TestGetConf
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.287 sec - in org.apache.hadoop.hdfs.tools.TestGetConf
Running org.apache.hadoop.hdfs.tools.TestStoragePolicyCommands
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.096 sec - in org.apache.hadoop.hdfs.tools.TestStoragePolicyCommands
Running org.apache.hadoop.hdfs.tools.TestDFSZKFailoverController
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 29.048 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Running org.apache.hadoop.hdfs.tools.TestDFSAdmin
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.496 sec - in org.apache.hadoop.hdfs.tools.TestDFSAdmin
Running org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewerForXAttr
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.091 sec - in org.apache.hadoop.hdfs.tools.TestDFSZKFailoverController
Running org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewerWithStripedBlocks
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.4 sec - in org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewerForXAttr
Running org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewer
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 104.612 sec - in org.apache.hadoop.hdfs.TestRollingUpgrade
Running org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewerForContentSummary
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.828 sec - in org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewerWithStripedBlocks
Running org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewerForAcl
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.156 sec - in org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewer
Running org.apache.hadoop.hdfs.tools.TestDFSHAAdmin
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.294 sec - in org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewerForContentSummary
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.864 sec - in org.apache.hadoop.hdfs.tools.TestDFSHAAdmin
Running org.apache.hadoop.hdfs.tools.TestDelegationTokenFetcher
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.021 sec - in org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewerForAcl
Running org.apache.hadoop.hdfs.tools.TestDebugAdmin
Running org.apache.hadoop.hdfs.tools.TestGetGroups
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 201.054 sec - in org.apache.hadoop.hdfs.TestAsyncDFSRename
Running org.apache.hadoop.hdfs.tools.TestDFSAdminWithHA
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.901 sec - in org.apache.hadoop.hdfs.tools.TestDelegationTokenFetcher
Running org.apache.hadoop.hdfs.tools.offlineEditsViewer.TestOfflineEditsViewer
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.932 sec - in org.apache.hadoop.hdfs.tools.TestGetGroups
Running org.apache.hadoop.hdfs.TestHDFSFileSystemContract
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.328 sec - in org.apache.hadoop.hdfs.tools.TestDebugAdmin
Running org.apache.hadoop.hdfs.TestClose
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.72 sec - in org.apache.hadoop.hdfs.TestClose
Running org.apache.hadoop.hdfs.TestFetchImage
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.939 sec - in org.apache.hadoop.hdfs.TestFetchImage
Running org.apache.hadoop.hdfs.TestInjectionForSimulatedStorage
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.008 sec - in org.apache.hadoop.hdfs.tools.TestDFSAdminWithHA
Running org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.908 sec - in org.apache.hadoop.hdfs.tools.offlineEditsViewer.TestOfflineEditsViewer
Running org.apache.hadoop.hdfs.protocolPB.TestPBHelper
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.714 sec - in org.apache.hadoop.hdfs.protocolPB.TestPBHelper
Running org.apache.hadoop.hdfs.TestFileAppend
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.889 sec - in org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Running org.apache.hadoop.hdfs.TestRollingUpgradeRollback
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.992 sec - in org.apache.hadoop.hdfs.TestInjectionForSimulatedStorage
Running org.apache.hadoop.hdfs.TestLease
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.333 sec - in org.apache.hadoop.hdfs.TestRollingUpgradeRollback
Running org.apache.hadoop.net.TestNetworkTopology
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.292 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.TestGenericRefresh
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.444 sec - in org.apache.hadoop.TestGenericRefresh
Running org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.294 sec - in org.apache.hadoop.net.TestNetworkTopology
Running org.apache.hadoop.tracing.TestTraceAdmin
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.273 sec - in org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Running org.apache.hadoop.tracing.TestTracing
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.077 sec - in org.apache.hadoop.tracing.TestTraceAdmin
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.167 sec - in org.apache.hadoop.tracing.TestTracing
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 36.164 sec - in org.apache.hadoop.hdfs.TestFileAppend
Running org.apache.hadoop.cli.TestDeleteCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.468 sec - in org.apache.hadoop.cli.TestDeleteCLI
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.545 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Running org.apache.hadoop.cli.TestAclCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.574 sec - in org.apache.hadoop.cli.TestAclCLI
Running org.apache.hadoop.cli.TestErasureCodingCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.395 sec - in org.apache.hadoop.cli.TestErasureCodingCLI
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 41.459 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Running org.apache.hadoop.cli.TestCacheAdminCLI
Running org.apache.hadoop.cli.TestXAttrCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.532 sec - in org.apache.hadoop.cli.TestCacheAdminCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.644 sec - in org.apache.hadoop.cli.TestXAttrCLI
Running org.apache.hadoop.cli.TestCryptoAdminCLI
Running org.apache.hadoop.security.TestPermission
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.051 sec - in org.apache.hadoop.security.TestPermission
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.682 sec - in org.apache.hadoop.cli.TestCryptoAdminCLI
Running org.apache.hadoop.security.TestRefreshUserMappings
Running org.apache.hadoop.security.TestPermissionSymlinks
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.24 sec - in org.apache.hadoop.security.TestRefreshUserMappings
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.347 sec - in org.apache.hadoop.security.TestPermissionSymlinks
Running org.apache.hadoop.tools.TestTools
Running org.apache.hadoop.tools.TestHdfsConfigFields
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.761 sec - in org.apache.hadoop.tools.TestHdfsConfigFields
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.453 sec - in org.apache.hadoop.tools.TestTools
Running org.apache.hadoop.tools.TestJMXGet
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.03 sec - in org.apache.hadoop.tools.TestJMXGet
Tests run: 44, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 143.024 sec - in org.apache.hadoop.hdfs.TestHDFSFileSystemContract
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 79.954 sec - in org.apache.hadoop.cli.TestHDFSCLI

Results :

Tests in error: 
  TestBlockStatsMXBean.testStorageTypeStatsWhenStorageFailed:193 » Bind Problem ...

Tests run: 4427, Failures: 0, Errors: 1, Skipped: 17

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [04:01 min]
[INFO] Apache Hadoop HDFS ................................. FAILURE [59:40 min]
[INFO] Apache Hadoop HDFS Native Client ................... SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.128 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:03 h
[INFO] Finished at: 2016-05-23T23:49:52+00:00
[INFO] Final Memory: 97M/4447M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Recording test results
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3

---------------------------------------------------------------------
To unsubscribe, e-mail: hdfs-dev-unsubscribe@hadoop.apache.org
For additional commands, e-mail: hdfs-dev-help@hadoop.apache.org


Hadoop-Hdfs-trunk-Java8 - Build # 1254 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1254/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 5451 lines...]
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [04:01 min]
[INFO] Apache Hadoop HDFS ................................. FAILURE [59:40 min]
[INFO] Apache Hadoop HDFS Native Client ................... SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.128 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:03 h
[INFO] Finished at: 2016-05-23T23:49:52+00:00
[INFO] Final Memory: 97M/4447M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Recording test results
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.hdfs.server.blockmanagement.TestBlockStatsMXBean.testStorageTypeStatsWhenStorageFailed

Error Message:
Problem binding to [localhost:36413] java.net.BindException: Address already in use; For more details see:  http://wiki.apache.org/hadoop/BindException

Stack Trace:
java.net.BindException: Problem binding to [localhost:36413] java.net.BindException: Address already in use; For more details see:  http://wiki.apache.org/hadoop/BindException
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.apache.hadoop.ipc.Server.bind(Server.java:530)
	at org.apache.hadoop.ipc.Server$Listener.<init>(Server.java:793)
	at org.apache.hadoop.ipc.Server.<init>(Server.java:2592)
	at org.apache.hadoop.ipc.RPC$Server.<init>(RPC.java:958)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server.<init>(ProtobufRpcEngine.java:559)
	at org.apache.hadoop.ipc.ProtobufRpcEngine.getServer(ProtobufRpcEngine.java:534)
	at org.apache.hadoop.ipc.RPC$Builder.build(RPC.java:800)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initIpcServer(DataNode.java:932)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1297)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:479)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2585)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2473)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2520)
	at org.apache.hadoop.hdfs.MiniDFSCluster.restartDataNode(MiniDFSCluster.java:2260)
	at org.apache.hadoop.hdfs.MiniDFSCluster.restartDataNode(MiniDFSCluster.java:2299)
	at org.apache.hadoop.hdfs.MiniDFSCluster.restartDataNode(MiniDFSCluster.java:2279)
	at org.apache.hadoop.hdfs.server.blockmanagement.TestBlockStatsMXBean.testStorageTypeStatsWhenStorageFailed(TestBlockStatsMXBean.java:193)
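
The bind failure above is the classic symptom of restarting a server on a fixed port that the previous instance (or another test on the same host) has not yet released. As a minimal standalone sketch of the underlying java.net behaviour only (the class name is invented for illustration; this is not the test's actual code), binding to port 0 lets the kernel pick a free ephemeral port and avoids that race entirely:

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.ServerSocket;

public class EphemeralPortSketch {
    public static void main(String[] args) throws IOException {
        // Port 0 asks the kernel for any free ephemeral port, so two runs
        // on the same host cannot collide the way a hard-coded port can.
        try (ServerSocket socket = new ServerSocket()) {
            socket.bind(new InetSocketAddress("localhost", 0));
            System.out.println("listening on free port " + socket.getLocalPort());
        }
    }
}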




Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #1253

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1253/changes>

Changes:

[aw] HADOOP-13112. Change CredentialShell to use CommandShell base class

------------------------------------------
[...truncated 7331 lines...]
Running org.apache.hadoop.hdfs.TestDFSInotifyEventInputStream
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.77 sec - in org.apache.hadoop.hdfs.TestDFSInotifyEventInputStream
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure110
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 126.852 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure150
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 64.511 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure110
Running org.apache.hadoop.hdfs.TestGetBlocks
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.315 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure020
Tests run: 7, Failures: 1, Errors: 3, Skipped: 0, Time elapsed: 261.989 sec <<< FAILURE! - in org.apache.hadoop.hdfs.TestAsyncDFSRename
testAggressiveConcurrentAsyncAPI(org.apache.hadoop.hdfs.TestAsyncDFSRename)  Time elapsed: 60.012 sec  <<< ERROR!
java.lang.Exception: test timed out after 60000 milliseconds
	at java.lang.Throwable.getStackTraceElement(Native Method)
	at java.lang.Throwable.getOurStackTrace(Throwable.java:827)
	at java.lang.Throwable.getStackTrace(Throwable.java:816)
	at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.log4j.spi.LocationInfo.<init>(LocationInfo.java:139)
	at org.apache.log4j.spi.LoggingEvent.getLocationInformation(LoggingEvent.java:253)
	at org.apache.log4j.helpers.PatternParser$LocationPatternConverter.convert(PatternParser.java:500)
	at org.apache.log4j.helpers.PatternConverter.format(PatternConverter.java:65)
	at org.apache.log4j.PatternLayout.format(PatternLayout.java:506)
	at org.apache.log4j.WriterAppender.subAppend(WriterAppender.java:310)
	at org.apache.log4j.WriterAppender.append(WriterAppender.java:162)
	at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
	at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
	at org.apache.log4j.Category.callAppenders(Category.java:206)
	at org.apache.log4j.Category.forcedLog(Category.java:391)
	at org.apache.log4j.Category.log(Category.java:856)
	at org.apache.commons.logging.impl.Log4JLogger.info(Log4JLogger.java:176)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shouldWait(MiniDFSCluster.java:2568)
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2487)
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2530)
	at org.apache.hadoop.hdfs.MiniDFSCluster.restartNameNodes(MiniDFSCluster.java:1995)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.internalTestConcurrentAsyncAPI(TestAsyncDFSRename.java:428)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.testAggressiveConcurrentAsyncAPI(TestAsyncDFSRename.java:289)

testAggressiveConcurrentAsyncRenameWithOverwrite(org.apache.hadoop.hdfs.TestAsyncDFSRename)  Time elapsed: 60.001 sec  <<< ERROR!
java.lang.Exception: test timed out after 60000 milliseconds
	at java.lang.Thread.sleep(Native Method)
	at org.apache.hadoop.hdfs.DFSOutputStream.completeFile(DFSOutputStream.java:825)
	at org.apache.hadoop.hdfs.DFSOutputStream.closeImpl(DFSOutputStream.java:784)
	at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:755)
	at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
	at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:101)
	at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:417)
	at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:375)
	at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:368)
	at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:361)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.internalTestConcurrentAsyncRenameWithOverwrite(TestAsyncDFSRename.java:226)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.testAggressiveConcurrentAsyncRenameWithOverwrite(TestAsyncDFSRename.java:199)

testCallGetReturnValueMultipleTimes(org.apache.hadoop.hdfs.TestAsyncDFSRename)  Time elapsed: 1.105 sec  <<< ERROR!
java.io.IOException: Cannot remove data directory: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs/datapath> '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs/data'>: 
	absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs/data>
	permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs'>: 
	absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs>
	permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2'>: 
	absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2>
	permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data'>: 
	absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data>
	permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test'>: 
	absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test>
	permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target'>: 
	absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target>
	permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs'>: 
	absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs>
	permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project'>: 
	absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project>
	permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source'>: 
	absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source>
	permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/'>: 
	absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/>
	permissions: drwx
path '/home/jenkins/jenkins-slave/workspace': 
	absolute:/home/jenkins/jenkins-slave/workspace
	permissions: drwx
path '/home/jenkins/jenkins-slave': 
	absolute:/home/jenkins/jenkins-slave
	permissions: drwx
path '/home/jenkins': 
	absolute:/home/jenkins
	permissions: drwx
path '/home': 
	absolute:/home
	permissions: dr-x
path '/': 
	absolute:/
	permissions: dr-x

	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:848)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:490)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:449)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.testCallGetReturnValueMultipleTimes(TestAsyncDFSRename.java:134)

testConservativeConcurrentAsyncRenameWithOverwrite(org.apache.hadoop.hdfs.TestAsyncDFSRename)  Time elapsed: 78.543 sec  <<< FAILURE!
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertFalse(Assert.java:64)
	at org.junit.Assert.assertFalse(Assert.java:74)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.internalTestConcurrentAsyncRenameWithOverwrite(TestAsyncDFSRename.java:259)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.testConservativeConcurrentAsyncRenameWithOverwrite(TestAsyncDFSRename.java:192)

        at java.util.concurrent.SynchronousQueue.poll(Synchron
Running org.apache.hadoop.hdfs.TestDecommissionWithStriped
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 54.658 sec - in org.apache.hadoop.hdfs.TestGetBlocks
Running org.apache.hadoop.hdfs.TestDataTransferProtocol
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.502 sec - in org.apache.hadoop.hdfs.TestDataTransferProtocol
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.144 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 71.917 sec - in org.apache.hadoop.hdfs.TestDecommissionWithStriped
Running org.apache.hadoop.hdfs.TestKeyProviderCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.529 sec - in org.apache.hadoop.hdfs.TestKeyProviderCache
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 436.88 sec - in org.apache.hadoop.hdfs.qjournal.client.TestQJMWithFaults
Running org.apache.hadoop.net.TestNetworkTopology
Running org.apache.hadoop.tracing.TestTraceAdmin
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.627 sec - in org.apache.hadoop.tracing.TestTraceAdmin
Running org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.455 sec - in org.apache.hadoop.net.TestNetworkTopology
Running org.apache.hadoop.tracing.TestTracing
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.765 sec - in org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 100.991 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure020
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.33 sec - in org.apache.hadoop.tracing.TestTracing
Running org.apache.hadoop.TestRefreshCallQueue
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.306 sec - in org.apache.hadoop.TestRefreshCallQueue
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 59.112 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 76.03 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 136.584 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure

Results :

Failed tests: 
  TestShortCircuitCache.testDataXceiverCleansUpSlotsOnFailure:682->checkNumberOfSegmentsAndSlots:628 expected:<1> but was:<2>
  TestHAAppend.testMultipleAppendsDuringCatchupTailing:125 inode should complete in ~60000 ms.
Expected: is <true>
     but: was <false>
  TestAsyncDFSRename.testConservativeConcurrentAsyncRenameWithOverwrite:192->internalTestConcurrentAsyncRenameWithOverwrite:259 null

Tests in error: 
  TestAsyncDFSRename.testAggressiveConcurrentAsyncAPI:289->internalTestConcurrentAsyncAPI:428 » 
  TestAsyncDFSRename.testAggressiveConcurrentAsyncRenameWithOverwrite:199->internalTestConcurrentAsyncRenameWithOverwrite:226 » 
  TestAsyncDFSRename.testCallGetReturnValueMultipleTimes:134 » IO Cannot remove ...

Tests run: 4427, Failures: 3, Errors: 3, Skipped: 17

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [04:11 min]
[INFO] Apache Hadoop HDFS ................................. FAILURE [  01:22 h]
[INFO] Apache Hadoop HDFS Native Client ................... SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.098 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:26 h
[INFO] Finished at: 2016-05-23T22:12:11+00:00
[INFO] Final Memory: 96M/3800M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Recording test results
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3

---------------------------------------------------------------------
To unsubscribe, e-mail: hdfs-dev-unsubscribe@hadoop.apache.org
For additional commands, e-mail: hdfs-dev-help@hadoop.apache.org


Hadoop-Hdfs-trunk-Java8 - Build # 1253 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1253/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 7528 lines...]
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [04:11 min]
[INFO] Apache Hadoop HDFS ................................. FAILURE [  01:22 h]
[INFO] Apache Hadoop HDFS Native Client ................... SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.098 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:26 h
[INFO] Finished at: 2016-05-23T22:12:11+00:00
[INFO] Final Memory: 96M/3800M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Recording test results
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestAsyncDFSRename.testAggressiveConcurrentAsyncAPI

Error Message:
test timed out after 60000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 60000 milliseconds
	at java.lang.Throwable.getStackTraceElement(Native Method)
	at java.lang.Throwable.getOurStackTrace(Throwable.java:827)
	at java.lang.Throwable.getStackTrace(Throwable.java:816)
	at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.log4j.spi.LocationInfo.<init>(LocationInfo.java:139)
	at org.apache.log4j.spi.LoggingEvent.getLocationInformation(LoggingEvent.java:253)
	at org.apache.log4j.helpers.PatternParser$LocationPatternConverter.convert(PatternParser.java:500)
	at org.apache.log4j.helpers.PatternConverter.format(PatternConverter.java:65)
	at org.apache.log4j.PatternLayout.format(PatternLayout.java:506)
	at org.apache.log4j.WriterAppender.subAppend(WriterAppender.java:310)
	at org.apache.log4j.WriterAppender.append(WriterAppender.java:162)
	at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
	at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
	at org.apache.log4j.Category.callAppenders(Category.java:206)
	at org.apache.log4j.Category.forcedLog(Category.java:391)
	at org.apache.log4j.Category.log(Category.java:856)
	at org.apache.commons.logging.impl.Log4JLogger.info(Log4JLogger.java:176)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shouldWait(MiniDFSCluster.java:2568)
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2487)
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2530)
	at org.apache.hadoop.hdfs.MiniDFSCluster.restartNameNodes(MiniDFSCluster.java:1995)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.internalTestConcurrentAsyncAPI(TestAsyncDFSRename.java:428)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.testAggressiveConcurrentAsyncAPI(TestAsyncDFSRename.java:289)


FAILED:  org.apache.hadoop.hdfs.TestAsyncDFSRename.testAggressiveConcurrentAsyncRenameWithOverwrite

Error Message:
test timed out after 60000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 60000 milliseconds
	at java.lang.Thread.sleep(Native Method)
	at org.apache.hadoop.hdfs.DFSOutputStream.completeFile(DFSOutputStream.java:825)
	at org.apache.hadoop.hdfs.DFSOutputStream.closeImpl(DFSOutputStream.java:784)
	at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:755)
	at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
	at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:101)
	at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:417)
	at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:375)
	at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:368)
	at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:361)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.internalTestConcurrentAsyncRenameWithOverwrite(TestAsyncDFSRename.java:226)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.testAggressiveConcurrentAsyncRenameWithOverwrite(TestAsyncDFSRename.java:199)
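
Both timeouts above carry JUnit 4's standard wording for a per-test timeout. Assuming the usual @Test(timeout = 60000) annotation on these methods, JUnit runs the body on a separate thread and, once 60 s elapse, fails the test with this message and attaches whatever stack the test thread happened to be on (here a log4j call and a Thread.sleep). A minimal illustrative sketch (class and method names are invented, not the test's code):

import org.junit.Test;

public class TimeoutSketch {
    // With JUnit 4, the body runs on its own thread; if it is still running
    // after 60 s the framework fails the test with a
    // "test timed out after 60000 milliseconds" error and reports the
    // thread's current stack trace.
    @Test(timeout = 60000)
    public void slowOperation() throws InterruptedException {
        Thread.sleep(70_000L); // deliberately longer than the timeout
    }
}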


FAILED:  org.apache.hadoop.hdfs.TestAsyncDFSRename.testCallGetReturnValueMultipleTimes

Error Message:
Cannot remove data directory: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs/datapath '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs/data': 
 absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs/data
 permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs': 
 absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs
 permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2': 
 absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2
 permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data': 
 absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data
 permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test': 
 absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test
 permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target': 
 absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target
 permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs': 
 absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs
 permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project': 
 absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project
 permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source': 
 absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source
 permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8': 
 absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8
 permissions: drwx
path '/home/jenkins/jenkins-slave/workspace': 
 absolute:/home/jenkins/jenkins-slave/workspace
 permissions: drwx
path '/home/jenkins/jenkins-slave': 
 absolute:/home/jenkins/jenkins-slave
 permissions: drwx
path '/home/jenkins': 
 absolute:/home/jenkins
 permissions: drwx
path '/home': 
 absolute:/home
 permissions: dr-x
path '/': 
 absolute:/
 permissions: dr-x


Stack Trace:
java.io.IOException: Cannot remove data directory: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs/datapath '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs/data': 
	absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs/data
	permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs': 
	absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs
	permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2': 
	absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2
	permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data': 
	absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data
	permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test': 
	absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/test
	permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target': 
	absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target
	permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs': 
	absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs
	permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project': 
	absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project
	permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source': 
	absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source
	permissions: drwx
path '/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8': 
	absolute:/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8
	permissions: drwx
path '/home/jenkins/jenkins-slave/workspace': 
	absolute:/home/jenkins/jenkins-slave/workspace
	permissions: drwx
path '/home/jenkins/jenkins-slave': 
	absolute:/home/jenkins/jenkins-slave
	permissions: drwx
path '/home/jenkins': 
	absolute:/home/jenkins
	permissions: drwx
path '/home': 
	absolute:/home
	permissions: dr-x
path '/': 
	absolute:/
	permissions: dr-x

	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:848)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:490)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:449)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.testCallGetReturnValueMultipleTimes(TestAsyncDFSRename.java:134)


FAILED:  org.apache.hadoop.hdfs.TestAsyncDFSRename.testConservativeConcurrentAsyncRenameWithOverwrite

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertFalse(Assert.java:64)
	at org.junit.Assert.assertFalse(Assert.java:74)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.internalTestConcurrentAsyncRenameWithOverwrite(TestAsyncDFSRename.java:259)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.testConservativeConcurrentAsyncRenameWithOverwrite(TestAsyncDFSRename.java:192)


FAILED:  org.apache.hadoop.hdfs.server.namenode.ha.TestHAAppend.testMultipleAppendsDuringCatchupTailing

Error Message:
inode should complete in ~60000 ms.
Expected: is <true>
     but: was <false>

Stack Trace:
java.lang.AssertionError: inode should complete in ~60000 ms.
Expected: is <true>
     but: was <false>
	at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:20)
	at org.junit.Assert.assertThat(Assert.java:865)
	at org.apache.hadoop.hdfs.server.namenode.TestFileTruncate.checkBlockRecovery(TestFileTruncate.java:1196)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestHAAppend.testMultipleAppendsDuringCatchupTailing(TestHAAppend.java:125)
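
The "Expected: is <true> / but: was <false>" layout is Hamcrest's standard output for assertThat with an is(true) matcher, prefixed by the caller's reason string. A minimal sketch of how a message of this shape is produced (names are invented, not the test's code):

import static org.hamcrest.CoreMatchers.is;
import static org.junit.Assert.assertThat;

public class AssertThatSketch {
    public static void main(String[] args) {
        boolean recovered = false; // stand-in for "block recovery finished in time"
        // Fails with an AssertionError whose message is the reason string
        // followed by Hamcrest's "Expected: is <true>" / "but: was <false>" lines.
        assertThat("inode should complete in ~60000 ms.", recovered, is(true));
    }
}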


FAILED:  org.apache.hadoop.hdfs.shortcircuit.TestShortCircuitCache.testDataXceiverCleansUpSlotsOnFailure

Error Message:
expected:<1> but was:<2>

Stack Trace:
java.lang.AssertionError: expected:<1> but was:<2>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.junit.Assert.assertEquals(Assert.java:542)
	at org.apache.hadoop.hdfs.shortcircuit.TestShortCircuitCache$17.accept(TestShortCircuitCache.java:633)
	at org.apache.hadoop.hdfs.server.datanode.ShortCircuitRegistry.visit(ShortCircuitRegistry.java:403)
	at org.apache.hadoop.hdfs.shortcircuit.TestShortCircuitCache.checkNumberOfSegmentsAndSlots(TestShortCircuitCache.java:628)
	at org.apache.hadoop.hdfs.shortcircuit.TestShortCircuitCache.testDataXceiverCleansUpSlotsOnFailure(TestShortCircuitCache.java:682)




Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #1252

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1252/changes>

Changes:

[jlowe] YARN-5055. max apps per user can be larger than max per queue.

------------------------------------------
[...truncated 5236 lines...]
Running org.apache.hadoop.hdfs.TestReconstructStripedFile
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 86.337 sec - in org.apache.hadoop.hdfs.TestLeaseRecovery2
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStream
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 55.565 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure050
Running org.apache.hadoop.hdfs.TestDFSInputStream
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.305 sec - in org.apache.hadoop.hdfs.TestDFSInputStream
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.714 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Running org.apache.hadoop.hdfs.TestFileAppend4
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 79.033 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStream
Running org.apache.hadoop.hdfs.TestParallelRead
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 154.356 sec - in org.apache.hadoop.hdfs.TestWriteReadStripedFile
Running org.apache.hadoop.hdfs.TestClose
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.575 sec - in org.apache.hadoop.hdfs.TestClose
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure040
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 111.262 sec - in org.apache.hadoop.hdfs.TestReconstructStripedFile
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.591 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure170
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 42.208 sec - in org.apache.hadoop.hdfs.TestFileAppend4
Running org.apache.hadoop.hdfs.TestParallelShortCircuitLegacyRead
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 40.231 sec - in org.apache.hadoop.hdfs.TestParallelRead
Running org.apache.hadoop.hdfs.TestLargeBlock
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.159 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitLegacyRead
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.622 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Running org.apache.hadoop.hdfs.TestClientReportBadBlock
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.715 sec - in org.apache.hadoop.hdfs.TestLargeBlock
Running org.apache.hadoop.hdfs.TestWriteRead
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 53.338 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure040
Running org.apache.hadoop.hdfs.TestClientProtocolForPipelineRecovery
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.349 sec - in org.apache.hadoop.hdfs.TestClientReportBadBlock
Running org.apache.hadoop.hdfs.TestBalancerBandwidth
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 61.347 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure170
Running org.apache.hadoop.hdfs.TestDFSUpgrade
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.456 sec - in org.apache.hadoop.hdfs.TestBalancerBandwidth
Running org.apache.hadoop.hdfs.TestBlockStoragePolicy
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.874 sec - in org.apache.hadoop.hdfs.TestWriteRead
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.915 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Running org.apache.hadoop.hdfs.TestLeaseRecoveryStriped
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.844 sec - in org.apache.hadoop.hdfs.TestDFSUpgrade
Running org.apache.hadoop.hdfs.client.impl.TestBlockReaderLocal
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 65.338 sec - in org.apache.hadoop.hdfs.TestClientProtocolForPipelineRecovery
Running org.apache.hadoop.hdfs.client.impl.TestBlockReaderLocalLegacy
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 37.018 sec - in org.apache.hadoop.hdfs.TestLeaseRecoveryStriped
Running org.apache.hadoop.hdfs.client.impl.TestBlockReaderRemote2
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 52.817 sec - in org.apache.hadoop.hdfs.TestBlockStoragePolicy
Running org.apache.hadoop.hdfs.client.impl.TestClientBlockVerification
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.997 sec - in org.apache.hadoop.hdfs.client.impl.TestBlockReaderLocalLegacy
Running org.apache.hadoop.hdfs.client.impl.TestBlockReaderRemote
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.012 sec - in org.apache.hadoop.hdfs.client.impl.TestBlockReaderRemote2
Running org.apache.hadoop.hdfs.client.impl.TestBlockReaderFactory
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.712 sec - in org.apache.hadoop.hdfs.client.impl.TestClientBlockVerification
Running org.apache.hadoop.hdfs.qjournal.TestSecureNNWithQJM
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.134 sec - in org.apache.hadoop.hdfs.client.impl.TestBlockReaderRemote
Running org.apache.hadoop.hdfs.qjournal.TestNNWithQJM
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.519 sec - in org.apache.hadoop.hdfs.qjournal.TestNNWithQJM
Running org.apache.hadoop.hdfs.qjournal.server.TestJournalNodeMXBean
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.626 sec - in org.apache.hadoop.hdfs.qjournal.server.TestJournalNodeMXBean
Running org.apache.hadoop.hdfs.qjournal.server.TestJournal
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.317 sec - in org.apache.hadoop.hdfs.client.impl.TestBlockReaderFactory
Running org.apache.hadoop.hdfs.qjournal.server.TestJournalNode
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.624 sec - in org.apache.hadoop.hdfs.qjournal.server.TestJournal
Running org.apache.hadoop.hdfs.qjournal.client.TestIPCLoggerChannel
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.934 sec - in org.apache.hadoop.hdfs.qjournal.client.TestIPCLoggerChannel
Running org.apache.hadoop.hdfs.qjournal.client.TestQJMWithFaults
Tests run: 37, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 70.418 sec - in org.apache.hadoop.hdfs.client.impl.TestBlockReaderLocal
Running org.apache.hadoop.hdfs.qjournal.client.TestEpochsAreUnique
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.859 sec - in org.apache.hadoop.hdfs.qjournal.client.TestEpochsAreUnique
Running org.apache.hadoop.hdfs.qjournal.client.TestQuorumJournalManagerUnit
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.596 sec - in org.apache.hadoop.hdfs.qjournal.client.TestQuorumJournalManagerUnit
Running org.apache.hadoop.hdfs.qjournal.client.TestSegmentRecoveryComparator
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.254 sec - in org.apache.hadoop.hdfs.qjournal.client.TestSegmentRecoveryComparator
Running org.apache.hadoop.hdfs.qjournal.client.TestQuorumCall
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.426 sec - in org.apache.hadoop.hdfs.qjournal.client.TestQuorumCall
Running org.apache.hadoop.hdfs.qjournal.client.TestQuorumJournalManager
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.061 sec - in org.apache.hadoop.hdfs.qjournal.server.TestJournalNode
Running org.apache.hadoop.hdfs.qjournal.TestMiniJournalCluster
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.84 sec - in org.apache.hadoop.hdfs.qjournal.TestMiniJournalCluster
Running org.apache.hadoop.hdfs.TestAsyncDFSRename
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 57.37 sec - in org.apache.hadoop.hdfs.qjournal.TestSecureNNWithQJM
Running org.apache.hadoop.hdfs.TestFileCreationEmpty
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.565 sec - in org.apache.hadoop.hdfs.TestFileCreationEmpty
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure200
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.409 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure200
Running org.apache.hadoop.hdfs.TestBlockMissingException
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.92 sec - in org.apache.hadoop.hdfs.TestBlockMissingException
Running org.apache.hadoop.hdfs.TestDataTransferKeepalive
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 49.897 sec - in org.apache.hadoop.hdfs.qjournal.client.TestQuorumJournalManager
Running org.apache.hadoop.hdfs.TestModTime
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.01 sec - in org.apache.hadoop.hdfs.TestDataTransferKeepalive
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure210
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.341 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure210
Running org.apache.hadoop.hdfs.TestRollingUpgradeDowngrade
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.973 sec - in org.apache.hadoop.hdfs.TestModTime
Running org.apache.hadoop.hdfs.TestLease
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.875 sec - in org.apache.hadoop.hdfs.TestRollingUpgradeDowngrade
Running org.apache.hadoop.hdfs.TestParallelUnixDomainRead
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.739 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.hdfs.TestDeprecatedKeys
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.584 sec - in org.apache.hadoop.hdfs.TestDeprecatedKeys
Running org.apache.hadoop.hdfs.TestFSOutputSummer
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.299 sec - in org.apache.hadoop.hdfs.TestFSOutputSummer
Running org.apache.hadoop.hdfs.TestRead
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.246 sec - in org.apache.hadoop.hdfs.TestRead
Running org.apache.hadoop.hdfs.TestFileStatusWithECPolicy
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 37.031 sec - in org.apache.hadoop.hdfs.TestParallelUnixDomainRead
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure150
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.349 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure150
Running org.apache.hadoop.hdfs.TestHDFSPolicyProvider
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.214 sec - in org.apache.hadoop.hdfs.TestHDFSPolicyProvider
Running org.apache.hadoop.hdfs.TestAppendSnapshotTruncate
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.518 sec - in org.apache.hadoop.hdfs.TestFileStatusWithECPolicy
Running org.apache.hadoop.hdfs.TestDFSInotifyEventInputStream
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.778 sec - in org.apache.hadoop.hdfs.TestDFSInotifyEventInputStream
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure110
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.39 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure110
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.398 sec - in org.apache.hadoop.hdfs.TestAppendSnapshotTruncate
Running org.apache.hadoop.hdfs.TestGetBlocks
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.457 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure020
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.344 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure020
Running org.apache.hadoop.hdfs.TestDecommissionWithStriped
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 48.53 sec - in org.apache.hadoop.hdfs.TestGetBlocks
Running org.apache.hadoop.hdfs.TestDataTransferProtocol
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.934 sec - in org.apache.hadoop.hdfs.TestDataTransferProtocol
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.389 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 209.074 sec - in org.apache.hadoop.hdfs.TestAsyncDFSRename
Running org.apache.hadoop.hdfs.TestKeyProviderCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.488 sec - in org.apache.hadoop.hdfs.TestKeyProviderCache
Running org.apache.hadoop.net.TestNetworkTopology
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 60.144 sec - in org.apache.hadoop.hdfs.TestDecommissionWithStriped
Running org.apache.hadoop.tracing.TestTraceAdmin
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.443 sec - in org.apache.hadoop.net.TestNetworkTopology
Running org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.934 sec - in org.apache.hadoop.tracing.TestTraceAdmin
Running org.apache.hadoop.tracing.TestTracing
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.309 sec - in org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.353 sec - in org.apache.hadoop.tracing.TestTracing
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 55.175 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Running org.apache.hadoop.TestRefreshCallQueue
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.949 sec - in org.apache.hadoop.TestRefreshCallQueue
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 63.847 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 325.916 sec - in org.apache.hadoop.hdfs.qjournal.client.TestQJMWithFaults
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 123.133 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure

Results :

Failed tests: 
  TestEditLog.testBatchedSyncWithClosedLogs:594 logging edit without syncing should not affect txid expected:<1> but was:<2>

Tests run: 4427, Failures: 1, Errors: 0, Skipped: 17

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [04:06 min]
[INFO] Apache Hadoop HDFS ................................. FAILURE [  01:11 h]
[INFO] Apache Hadoop HDFS Native Client ................... SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.079 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:15 h
[INFO] Finished at: 2016-05-23T18:23:23+00:00
[INFO] Final Memory: 96M/3577M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Recording test results
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3



Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #1251

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1251/changes>

Changes:

[jlowe] YARN-5103. With NM recovery enabled, restarting NM multiple times

------------------------------------------
[...truncated 7173 lines...]
        at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
        at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
        at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:101)
        at org.apache.hadoop.ipc.Server$Listener$Reader.doRunLoop(Server.java:848)
        at org.apache.hadoop.ipc.Server$Listener$Reader.run(Server.java:827)
"IPC Client (984930046) connection to localhost/127.0.0.1:46752 from jenkins" daemon prio=5 tid=13478 timed_waiting
java.lang.Thread.State: TIMED_WAITING
        at java.lang.Object.wait(Native Method)
        at org.apache.hadoop.ipc.Client$Connection.waitForWork(Client.java:989)
        at org.apache.hadoop.ipc.Client$Connection.run(Client.java:1034)
"refreshUsed-<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/3/dfs/data/data4/current/BP-1596740347-67.195.81.152-1464022707286"> daemon prio=5 tid=13501 timed_waiting
java.lang.Thread.State: TIMED_WAITING
        at java.lang.Thread.sleep(Native Method)
        at org.apache.hadoop.fs.CachingGetSpaceUsed$RefreshThread.run(CachingGetSpaceUsed.java:158)
        at java.lang.Thread.run(Thread.java:745)
"VolumeScannerThread(<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/3/dfs/data/data2)"> daemon prio=5 tid=13481 timed_waiting
java.lang.Thread.State: TIMED_WAITING
        at java.lang.Object.wait(Native Method)
        at org.apache.hadoop.hdfs.server.datanode.VolumeScanner.run(VolumeScanner.java:613)
"nioEventLoopGroup-22-1"  prio=10 tid=13458 runnable
java.lang.Thread.State: RUNNABLE
        at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
        at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269)
        at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:93)
        at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
        at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
        at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:621)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:309)
        at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:703)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
        at java.lang.Thread.run(Thread.java:745)
"Timer-55" daemon prio=5 tid=13384 timed_waiting
java.lang.Thread.State: TIMED_WAITING
        at java.lang.Object.wait(Native Method)
        at java.util.TimerThread.mainLoop(Timer.java:552)
        at java.util.TimerThread.run(Timer.java:505)
"refreshUsed-<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/3/dfs/data/data2/current/BP-1596740347-67.195.81.152-1464022707286"> daemon prio=5 tid=13488 timed_waiting
java.lang.Thread.State: TIMED_WAITING
        at java.lang.Thread.sleep(Native Method)
        at org.apache.hadoop.fs.CachingGetSpaceUsed$RefreshThread.run(CachingGetSpaceUsed.java:158)
        at java.lang.Thread.run(Thread.java:745)
"IPC Server handler 3 on 46752" daemon prio=5 tid=13403 timed_waiting
java.lang.Thread.State: TIMED_WAITING
        at sun.misc.Unsafe.park(Native Method)
        at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
        at java.util.concurrent.LinkedBlockingQueue.poll(LinkedBlockingQueue.java:467)
        at org.apache.hadoop.ipc.CallQueueManager.take(CallQueueManager.java:218)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2387)
"IPC Server handler 4 on 50909" daemon prio=5 tid=13442 timed_waiting
java.lang.Thread.State: TIMED_WAITING
        at sun.misc.Unsafe.park(Native Method)
        at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
        at java.util.concurrent.LinkedBlockingQueue.poll(LinkedBlockingQueue.java:467)
        at org.apache.hadoop.ipc.CallQueueManager.take(CallQueueManager.java:218)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2387)
"1722691704@qtp-866737370-1 - Acceptor0 SelectChannelConnector@localhost:57530" daemon prio=5 tid=13423 runnable
java.lang.Thread.State: RUNNABLE
        at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
        at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269)
        at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:93)
        at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
        at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
        at org.mortbay.io.nio.SelectorManager$SelectSet.doSelect(SelectorManager.java:498)
        at org.mortbay.io.nio.SelectorManager.doSelect(SelectorManager.java:192)
        at org.mortbay.jetty.nio.SelectChannelConnector.accept(SelectChannelConnector.java:124)
        at org.mortbay.jetty.AbstractConnector$Acceptor.run(AbstractConnector.java:708)
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 33.72 sec - in org.apache.hadoop.hdfs.TestAppendSnapshotTruncate
Running org.apache.hadoop.hdfs.TestDFSInotifyEventInputStream
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.093 sec - in org.apache.hadoop.hdfs.TestDFSInotifyEventInputStream
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure110
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 126.385 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure150
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 63.53 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure110
Running org.apache.hadoop.hdfs.TestGetBlocks
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 33.727 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure020
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.488 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure020
Running org.apache.hadoop.hdfs.TestDecommissionWithStriped
Tests run: 7, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 288.423 sec <<< FAILURE! - in org.apache.hadoop.hdfs.TestAsyncDFSRename
testAggressiveConcurrentAsyncAPI(org.apache.hadoop.hdfs.TestAsyncDFSRename)  Time elapsed: 60.025 sec  <<< ERROR!
java.lang.Exception: test timed out after 60000 milliseconds
	at org.apache.hadoop.hdfs.server.namenode.FSEditLogLoader.loadEditRecords(FSEditLogLoader.java:260)
	at org.apache.hadoop.hdfs.server.namenode.FSEditLogLoader.loadFSEdits(FSEditLogLoader.java:149)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.loadEdits(FSImage.java:837)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.loadFSImage(FSImage.java:694)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:290)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:990)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:659)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:650)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:712)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:928)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:907)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1624)
	at org.apache.hadoop.hdfs.MiniDFSCluster.restartNameNode(MiniDFSCluster.java:2038)
	at org.apache.hadoop.hdfs.MiniDFSCluster.restartNameNodes(MiniDFSCluster.java:1993)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.internalTestConcurrentAsyncAPI(TestAsyncDFSRename.java:428)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.testAggressiveConcurrentAsyncAPI(TestAsyncDFSRename.java:289)

testAggressiveConcurrentAsyncRenameWithOverwrite(org.apache.hadoop.hdfs.TestAsyncDFSRename)  Time elapsed: 60.014 sec  <<< ERROR!
java.lang.Exception: test timed out after 60000 milliseconds
	at java.lang.Object.wait(Native Method)
	at org.apache.hadoop.hdfs.DataStreamer.waitForAckedSeqno(DataStreamer.java:772)
	at org.apache.hadoop.hdfs.DFSOutputStream.flushInternal(DFSOutputStream.java:697)
	at org.apache.hadoop.hdfs.DFSOutputStream.closeImpl(DFSOutputStream.java:778)
	at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:755)
	at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
	at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:101)
	at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:417)
	at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:375)
	at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:368)
	at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:361)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.internalTestConcurrentAsyncRenameWithOverwrite(TestAsyncDFSRename.java:226)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.testAggressiveConcurrentAsyncRenameWithOverwrite(TestAsyncDFSRename.java:199)

        at org.mortbay.thread.QueuedThreadPool$PoolThread.run(Queue
Running org.apache.hadoop.hdfs.TestDataTransferProtocol
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 45.711 sec - in org.apache.hadoop.hdfs.TestGetBlocks
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.845 sec - in org.apache.hadoop.hdfs.TestDataTransferProtocol
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 363.949 sec - in org.apache.hadoop.hdfs.qjournal.client.TestQJMWithFaults
Running org.apache.hadoop.hdfs.TestKeyProviderCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.792 sec - in org.apache.hadoop.hdfs.TestKeyProviderCache
Running org.apache.hadoop.net.TestNetworkTopology
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.309 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.tracing.TestTraceAdmin
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.61 sec - in org.apache.hadoop.tracing.TestTraceAdmin
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.699 sec - in org.apache.hadoop.net.TestNetworkTopology
Running org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Running org.apache.hadoop.tracing.TestTracing
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.413 sec - in org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.556 sec - in org.apache.hadoop.tracing.TestTracing
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 77.431 sec - in org.apache.hadoop.hdfs.TestDecommissionWithStriped
Running org.apache.hadoop.TestRefreshCallQueue
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.669 sec - in org.apache.hadoop.TestRefreshCallQueue
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 57.445 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 66.612 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 130.511 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure

Results :

Failed tests: 
  TestParallelShortCircuitReadUnCached>TestParallelReadUtil.testParallelReadByteBuffer:411->TestParallelReadUtil.runTestWorkload:386 Check log for errors
  TestParallelShortCircuitReadUnCached>TestParallelReadUtil.testParallelReadMixed:416->TestParallelReadUtil.runTestWorkload:383 Check log for errors
  TestParallelShortCircuitReadUnCached>TestParallelReadUtil.testParallelNoChecksums:422->TestParallelReadUtil.runTestWorkload:383 Check log for errors
  TestParallelShortCircuitReadUnCached>TestParallelReadUtil.testParallelReadCopying:406->TestParallelReadUtil.runTestWorkload:383 Check log for errors

Tests in error: 
  TestOpenFilesWithSnapshot.testParentDirWithUCFileDeleteWithSnapShot:82 » IO Ti...
  TestPendingInvalidateBlock.testPendingDeleteUnknownBlocks:160 » Bind Problem b...
  TestAsyncDFSRename.testAggressiveConcurrentAsyncAPI:289->internalTestConcurrentAsyncAPI:428 » 
  TestAsyncDFSRename.testAggressiveConcurrentAsyncRenameWithOverwrite:199->internalTestConcurrentAsyncRenameWithOverwrite:226->Object.wait:-2 » 

Tests run: 4427, Failures: 4, Errors: 4, Skipped: 17

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [03:54 min]
[INFO] Apache Hadoop HDFS ................................. FAILURE [  01:14 h]
[INFO] Apache Hadoop HDFS Native Client ................... SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.572 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:18 h
[INFO] Finished at: 2016-05-23T17:04:15+00:00
[INFO] Final Memory: 94M/3566M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Recording test results
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3



Hadoop-Hdfs-trunk-Java8 - Build # 1251 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1251/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 7370 lines...]
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [03:54 min]
[INFO] Apache Hadoop HDFS ................................. FAILURE [  01:14 h]
[INFO] Apache Hadoop HDFS Native Client ................... SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.572 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:18 h
[INFO] Finished at: 2016-05-23T17:04:15+00:00
[INFO] Final Memory: 94M/3566M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/source/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Recording test results
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3



###################################################################################
############################## FAILED TESTS (if any) ##############################
8 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestAsyncDFSRename.testAggressiveConcurrentAsyncAPI

Error Message:
test timed out after 60000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 60000 milliseconds
	at org.apache.hadoop.hdfs.server.namenode.FSEditLogLoader.loadEditRecords(FSEditLogLoader.java:260)
	at org.apache.hadoop.hdfs.server.namenode.FSEditLogLoader.loadFSEdits(FSEditLogLoader.java:149)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.loadEdits(FSImage.java:837)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.loadFSImage(FSImage.java:694)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:290)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:990)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:659)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:650)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:712)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:928)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:907)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1624)
	at org.apache.hadoop.hdfs.MiniDFSCluster.restartNameNode(MiniDFSCluster.java:2038)
	at org.apache.hadoop.hdfs.MiniDFSCluster.restartNameNodes(MiniDFSCluster.java:1993)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.internalTestConcurrentAsyncAPI(TestAsyncDFSRename.java:428)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.testAggressiveConcurrentAsyncAPI(TestAsyncDFSRename.java:289)


FAILED:  org.apache.hadoop.hdfs.TestAsyncDFSRename.testAggressiveConcurrentAsyncRenameWithOverwrite

Error Message:
test timed out after 60000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 60000 milliseconds
	at java.lang.Object.wait(Native Method)
	at org.apache.hadoop.hdfs.DataStreamer.waitForAckedSeqno(DataStreamer.java:772)
	at org.apache.hadoop.hdfs.DFSOutputStream.flushInternal(DFSOutputStream.java:697)
	at org.apache.hadoop.hdfs.DFSOutputStream.closeImpl(DFSOutputStream.java:778)
	at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:755)
	at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
	at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:101)
	at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:417)
	at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:375)
	at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:368)
	at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:361)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.internalTestConcurrentAsyncRenameWithOverwrite(TestAsyncDFSRename.java:226)
	at org.apache.hadoop.hdfs.TestAsyncDFSRename.testAggressiveConcurrentAsyncRenameWithOverwrite(TestAsyncDFSRename.java:199)


FAILED:  org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached.testParallelReadByteBuffer

Error Message:
Check log for errors

Stack Trace:
java.lang.AssertionError: Check log for errors
	at org.junit.Assert.fail(Assert.java:88)
	at org.apache.hadoop.hdfs.TestParallelReadUtil.runTestWorkload(TestParallelReadUtil.java:386)
	at org.apache.hadoop.hdfs.TestParallelReadUtil.testParallelReadByteBuffer(TestParallelReadUtil.java:411)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached.testParallelReadMixed

Error Message:
Check log for errors

Stack Trace:
java.lang.AssertionError: Check log for errors
	at org.junit.Assert.fail(Assert.java:88)
	at org.apache.hadoop.hdfs.TestParallelReadUtil.runTestWorkload(TestParallelReadUtil.java:383)
	at org.apache.hadoop.hdfs.TestParallelReadUtil.testParallelReadMixed(TestParallelReadUtil.java:416)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached.testParallelNoChecksums

Error Message:
Check log for errors

Stack Trace:
java.lang.AssertionError: Check log for errors
	at org.junit.Assert.fail(Assert.java:88)
	at org.apache.hadoop.hdfs.TestParallelReadUtil.runTestWorkload(TestParallelReadUtil.java:383)
	at org.apache.hadoop.hdfs.TestParallelReadUtil.testParallelNoChecksums(TestParallelReadUtil.java:422)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached.testParallelReadCopying

Error Message:
Check log for errors

Stack Trace:
java.lang.AssertionError: Check log for errors
	at org.junit.Assert.fail(Assert.java:88)
	at org.apache.hadoop.hdfs.TestParallelReadUtil.runTestWorkload(TestParallelReadUtil.java:383)
	at org.apache.hadoop.hdfs.TestParallelReadUtil.testParallelReadCopying(TestParallelReadUtil.java:406)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.server.blockmanagement.TestPendingInvalidateBlock.testPendingDeleteUnknownBlocks

Error Message:
Problem binding to [localhost:40111] java.net.BindException: Address already in use; For more details see:  http://wiki.apache.org/hadoop/BindException

Stack Trace:
java.net.BindException: Problem binding to [localhost:40111] java.net.BindException: Address already in use; For more details see:  http://wiki.apache.org/hadoop/BindException
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.apache.hadoop.ipc.Server.bind(Server.java:530)
	at org.apache.hadoop.ipc.Server$Listener.<init>(Server.java:793)
	at org.apache.hadoop.ipc.Server.<init>(Server.java:2592)
	at org.apache.hadoop.ipc.RPC$Server.<init>(RPC.java:958)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server.<init>(ProtobufRpcEngine.java:559)
	at org.apache.hadoop.ipc.ProtobufRpcEngine.getServer(ProtobufRpcEngine.java:534)
	at org.apache.hadoop.ipc.RPC$Builder.build(RPC.java:800)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.initIpcServer(DataNode.java:932)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1297)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:479)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2585)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2473)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2520)
	at org.apache.hadoop.hdfs.MiniDFSCluster.restartDataNode(MiniDFSCluster.java:2260)
	at org.apache.hadoop.hdfs.server.blockmanagement.TestPendingInvalidateBlock.testPendingDeleteUnknownBlocks(TestPendingInvalidateBlock.java:160)


FAILED:  org.apache.hadoop.hdfs.server.namenode.snapshot.TestOpenFilesWithSnapshot.testParentDirWithUCFileDeleteWithSnapShot

Error Message:
Timed out waiting for Mini HDFS Cluster to start

Stack Trace:
java.io.IOException: Timed out waiting for Mini HDFS Cluster to start
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitClusterUp(MiniDFSCluster.java:1363)
	at org.apache.hadoop.hdfs.MiniDFSCluster.restartNameNode(MiniDFSCluster.java:2042)
	at org.apache.hadoop.hdfs.MiniDFSCluster.restartNameNode(MiniDFSCluster.java:2003)
	at org.apache.hadoop.hdfs.server.namenode.snapshot.TestOpenFilesWithSnapshot.testParentDirWithUCFileDeleteWithSnapShot(TestOpenFilesWithSnapshot.java:82)




Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #1250

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1250/changes>

Changes:

[junping_du] YARN-5112. Excessive log warnings for directory permission issue on NM

------------------------------------------
[...truncated 5266 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.884 sec - in org.apache.hadoop.hdfs.TestDatanodeReport
Running org.apache.hadoop.hdfs.protocolPB.TestPBHelper
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.691 sec - in org.apache.hadoop.hdfs.protocolPB.TestPBHelper
Running org.apache.hadoop.hdfs.TestEncryptedTransfer
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 90.772 sec - in org.apache.hadoop.hdfs.server.mover.TestMover
Running org.apache.hadoop.hdfs.TestPipelines
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.721 sec - in org.apache.hadoop.hdfs.TestPipelines
Running org.apache.hadoop.hdfs.TestHttpPolicy
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.542 sec - in org.apache.hadoop.hdfs.TestHttpPolicy
Running org.apache.hadoop.hdfs.TestEncryptionZonesWithHA
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.229 sec - in org.apache.hadoop.hdfs.TestEncryptionZonesWithHA
Running org.apache.hadoop.hdfs.TestDFSClientSocketSize
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.49 sec - in org.apache.hadoop.hdfs.TestDFSClientSocketSize
Running org.apache.hadoop.hdfs.TestWriteRead
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.617 sec - in org.apache.hadoop.hdfs.TestWriteRead
Running org.apache.hadoop.hdfs.TestDFSInotifyEventInputStream
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.023 sec - in org.apache.hadoop.hdfs.TestDFSInotifyEventInputStream
Running org.apache.hadoop.hdfs.TestClientProtocolForPipelineRecovery
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 111.691 sec - in org.apache.hadoop.hdfs.TestEncryptedTransfer
Running org.apache.hadoop.hdfs.TestReplaceDatanodeOnFailure
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 185.956 sec - in org.apache.hadoop.hdfs.server.mover.TestStorageMover
Running org.apache.hadoop.hdfs.TestPersistBlocks
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.324 sec - in org.apache.hadoop.hdfs.TestReplaceDatanodeOnFailure
Running org.apache.hadoop.hdfs.TestFSInputChecker
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.87 sec - in org.apache.hadoop.hdfs.TestFSInputChecker
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure170
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.692 sec - in org.apache.hadoop.hdfs.TestPersistBlocks
Running org.apache.hadoop.fs.TestFcHdfsSetUMask
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.966 sec - in org.apache.hadoop.fs.TestFcHdfsSetUMask
Running org.apache.hadoop.fs.TestFcHdfsPermission
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.036 sec - in org.apache.hadoop.fs.TestFcHdfsPermission
Running org.apache.hadoop.fs.TestGlobPaths
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 53.427 sec - in org.apache.hadoop.hdfs.TestClientProtocolForPipelineRecovery
Running org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Tests run: 37, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 5.421 sec - in org.apache.hadoop.fs.TestGlobPaths
Running org.apache.hadoop.fs.TestSymlinkHdfsDisable
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.853 sec - in org.apache.hadoop.fs.TestSymlinkHdfsDisable
Running org.apache.hadoop.fs.TestSymlinkHdfsFileSystem
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.654 sec - in org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Running org.apache.hadoop.fs.TestUrlStreamHandler
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 44.706 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure170
Running org.apache.hadoop.fs.TestUnbuffer
Tests run: 74, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 13.662 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileSystem
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.349 sec - in org.apache.hadoop.fs.TestUrlStreamHandler
Running org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.55 sec - in org.apache.hadoop.fs.TestUnbuffer
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.183 sec - in org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Running org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.183 sec - in org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Running org.apache.hadoop.fs.shell.TestHdfsTextCommand
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.326 sec - in org.apache.hadoop.fs.shell.TestHdfsTextCommand
Running org.apache.hadoop.fs.TestSWebHdfsFileContextMainOperations
Tests run: 10, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 16.93 sec - in org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Running org.apache.hadoop.fs.TestResolveHdfsSymlink
Tests run: 69, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.456 sec - in org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Running org.apache.hadoop.fs.TestWebHdfsFileContextMainOperations
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.673 sec - in org.apache.hadoop.fs.TestResolveHdfsSymlink
Running org.apache.hadoop.fs.viewfs.TestViewFsWithAcls
Tests run: 61, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.668 sec - in org.apache.hadoop.fs.TestSWebHdfsFileContextMainOperations
Running org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.347 sec - in org.apache.hadoop.fs.viewfs.TestViewFsWithAcls
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.7 sec - in org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsWithXAttrs
Tests run: 61, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.295 sec - in org.apache.hadoop.fs.TestWebHdfsFileContextMainOperations
Running org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.934 sec - in org.apache.hadoop.fs.viewfs.TestViewFsWithXAttrs
Running org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Tests run: 58, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.2 sec - in org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemWithAcls
Tests run: 32, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 312.008 sec - in org.apache.hadoop.hdfs.server.balancer.TestBalancer
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.63 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemWithXAttrs
Tests run: 58, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.836 sec - in org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.232 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemWithAcls
Running org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.118 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemWithXAttrs
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.427 sec - in org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractDelete
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractSetTimes
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.286 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractSetTimes
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.009 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractDelete
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractMkdir
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractOpen
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.676 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractMkdir
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.607 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractOpen
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractRootDirectory
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractAppend
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.798 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractCreate
Tests run: 71, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.26 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractSeek
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.198 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractRootDirectory
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractRename
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.464 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractAppend
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractGetFileStatus
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.284 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractCreate
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractConcat
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.505 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractRename
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.235 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractGetFileStatus
Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.285 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractSeek
Running org.apache.hadoop.fs.permission.TestStickyBit
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.035 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractConcat
Running org.apache.hadoop.TestRefreshCallQueue
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.918 sec - in org.apache.hadoop.TestRefreshCallQueue
Running org.apache.hadoop.security.TestPermissionSymlinks
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.226 sec - in org.apache.hadoop.fs.permission.TestStickyBit
Running org.apache.hadoop.security.TestRefreshUserMappings
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.49 sec - in org.apache.hadoop.security.TestPermissionSymlinks
Running org.apache.hadoop.security.TestPermission
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.986 sec - in org.apache.hadoop.security.TestRefreshUserMappings
Running org.apache.hadoop.tools.TestTools
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.454 sec - in org.apache.hadoop.tools.TestTools
Running org.apache.hadoop.tools.TestJMXGet
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.832 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Running org.apache.hadoop.tools.TestHdfsConfigFields
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.233 sec - in org.apache.hadoop.security.TestPermission
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.664 sec - in org.apache.hadoop.tools.TestHdfsConfigFields
Running org.apache.hadoop.tracing.TestTracing
Running org.apache.hadoop.tracing.TestTraceAdmin
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.972 sec - in org.apache.hadoop.tools.TestJMXGet
Running org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.3 sec - in org.apache.hadoop.tracing.TestTraceAdmin
Running org.apache.hadoop.net.TestNetworkTopology
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.238 sec - in org.apache.hadoop.tracing.TestTracing
Running org.apache.hadoop.TestGenericRefresh
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.046 sec - in org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.695 sec - in org.apache.hadoop.TestGenericRefresh
Running org.apache.hadoop.cli.TestCacheAdminCLI
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.7 sec - in org.apache.hadoop.net.TestNetworkTopology
Running org.apache.hadoop.cli.TestAclCLI
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 42.361 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Running org.apache.hadoop.cli.TestErasureCodingCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.703 sec - in org.apache.hadoop.cli.TestCacheAdminCLI
Running org.apache.hadoop.cli.TestCryptoAdminCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.965 sec - in org.apache.hadoop.cli.TestAclCLI
Running org.apache.hadoop.cli.TestXAttrCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.072 sec - in org.apache.hadoop.cli.TestErasureCodingCLI
Running org.apache.hadoop.cli.TestDeleteCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.714 sec - in org.apache.hadoop.cli.TestCryptoAdminCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.441 sec - in org.apache.hadoop.cli.TestXAttrCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.126 sec - in org.apache.hadoop.cli.TestDeleteCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 71.141 sec - in org.apache.hadoop.cli.TestHDFSCLI

Results :

Tests in error: 
  TestDFSUpgradeFromImage.testUpgradeFromRel1BBWImage:628->upgradeAndVerify:606->verifyFileSystem:229->verifyDir:214->dfsOpenFileWithRetries:178 » IO
  TestDataNodeMultipleRegistrations.testDNWithInvalidStorageWithHA:293 » Bind Pr...

Tests run: 4427, Failures: 0, Errors: 2, Skipped: 17
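
The two suites listed in error can usually be re-run in isolation from a local trunk checkout to see whether the failures are reproducible or environmental; a rough sketch, assuming the hadoop source root and surefire's -Dtest filter (available in the surefire 2.17 this build uses):

    # Re-run only the two suites reported in error by build #1248.
    cd hadoop-hdfs-project/hadoop-hdfs
    mvn test -Dtest=TestDFSUpgradeFromImage,TestDataNodeMultipleRegistrations

The Bind error in TestDataNodeMultipleRegistrations is often just a port clash on a busy build slave, so it may not reproduce on an idle machine.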

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [03:58 min]
[INFO] Apache Hadoop HDFS ................................. FAILURE [57:26 min]
[INFO] Apache Hadoop HDFS Native Client ................... SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.104 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:01 h
[INFO] Finished at: 2016-05-23T00:50:24+00:00
[INFO] Final Memory: 96M/3607M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
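
Filling Maven's resume hint in with concrete goals gives a minimal local retry of just the failed module; a sketch (the -e switch is the one the log itself suggests for full stack traces):

    # Resume the reactor at the failed module, printing full stack traces on error.
    mvn test -e -rf :hadoop-hdfs
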
Build step 'Execute shell' marked build as failure
Archiving artifacts
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Recording test results
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3


Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #1249

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1249/changes>

Changes:

[aajisaka] MAPREDUCE-6607. Enable regex pattern matching when

------------------------------------------
[...truncated 5247 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.914 sec - in org.apache.hadoop.hdfs.TestSetrepIncreasing
Running org.apache.hadoop.hdfs.TestEncryptedTransfer
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.154 sec - in org.apache.hadoop.hdfs.TestDataTransferKeepalive
Running org.apache.hadoop.hdfs.TestDatanodeDeath
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 58.625 sec - in org.apache.hadoop.hdfs.TestDatanodeDeath
Running org.apache.hadoop.hdfs.TestHDFSFileSystemContract
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 94.889 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
Running org.apache.hadoop.hdfs.TestHFlush
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 112.629 sec - in org.apache.hadoop.hdfs.TestEncryptedTransfer
Running org.apache.hadoop.hdfs.TestDisableConnCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.627 sec - in org.apache.hadoop.hdfs.TestDisableConnCache
Running org.apache.hadoop.hdfs.TestParallelUnixDomainRead
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 148.35 sec - in org.apache.hadoop.hdfs.TestDFSClientRetries
Running org.apache.hadoop.hdfs.TestClientProtocolForPipelineRecovery
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.966 sec - in org.apache.hadoop.hdfs.TestHFlush
Running org.apache.hadoop.hdfs.TestHDFSPolicyProvider
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.26 sec - in org.apache.hadoop.hdfs.TestHDFSPolicyProvider
Running org.apache.hadoop.hdfs.TestWriteReadStripedFile
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 41.71 sec - in org.apache.hadoop.hdfs.TestParallelUnixDomainRead
Running org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.872 sec - in org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Running org.apache.hadoop.hdfs.TestDatanodeReport
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 56.402 sec - in org.apache.hadoop.hdfs.TestClientProtocolForPipelineRecovery
Running org.apache.hadoop.hdfs.TestMiniDFSCluster
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.857 sec - in org.apache.hadoop.hdfs.TestMiniDFSCluster
Running org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.651 sec - in org.apache.hadoop.hdfs.TestDatanodeReport
Running org.apache.hadoop.hdfs.TestDFSClientSocketSize
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.901 sec - in org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 44, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 124.216 sec - in org.apache.hadoop.hdfs.TestHDFSFileSystemContract
Running org.apache.hadoop.hdfs.TestDFSStorageStateRecovery
Running org.apache.hadoop.hdfs.TestSnapshotCommands
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.935 sec - in org.apache.hadoop.hdfs.TestDFSClientSocketSize
Running org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.702 sec - in org.apache.hadoop.hdfs.TestSnapshotCommands
Running org.apache.hadoop.hdfs.TestDFSPermission
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.376 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure170
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.645 sec - in org.apache.hadoop.hdfs.TestDFSPermission
Running org.apache.hadoop.hdfs.TestParallelRead
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 132.4 sec - in org.apache.hadoop.hdfs.TestWriteReadStripedFile
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStream
ERROR: Could not install LATEST1_8_HOME
java.lang.NullPointerException
	at hudson.plugins.toolenv.ToolEnvBuildWrapper$1.buildEnvVars(ToolEnvBuildWrapper.java:46)
	at hudson.model.AbstractBuild.getEnvironment(AbstractBuild.java:947)
	at hudson.plugins.git.GitSCM.getParamExpandedRepos(GitSCM.java:390)
	at hudson.plugins.git.GitSCM.compareRemoteRevisionWithImpl(GitSCM.java:577)
	at hudson.plugins.git.GitSCM.compareRemoteRevisionWith(GitSCM.java:527)
	at hudson.scm.SCM.compareRemoteRevisionWith(SCM.java:381)
	at hudson.scm.SCM.poll(SCM.java:398)
	at hudson.model.AbstractProject._poll(AbstractProject.java:1453)
	at hudson.model.AbstractProject.poll(AbstractProject.java:1356)
	at hudson.triggers.SCMTrigger$Runner.runPolling(SCMTrigger.java:526)
	at hudson.triggers.SCMTrigger$Runner.run(SCMTrigger.java:555)
	at hudson.util.SequentialExecutionQueue$QueueEntry.run(SequentialExecutionQueue.java:119)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
ERROR: Could not install MAVEN_3_3_3_HOME
java.lang.NullPointerException
	at hudson.plugins.toolenv.ToolEnvBuildWrapper$1.buildEnvVars(ToolEnvBuildWrapper.java:46)
	at hudson.model.AbstractBuild.getEnvironment(AbstractBuild.java:947)
	at hudson.plugins.git.GitSCM.getParamExpandedRepos(GitSCM.java:390)
	at hudson.plugins.git.GitSCM.compareRemoteRevisionWithImpl(GitSCM.java:577)
	at hudson.plugins.git.GitSCM.compareRemoteRevisionWith(GitSCM.java:527)
	at hudson.scm.SCM.compareRemoteRevisionWith(SCM.java:381)
	at hudson.scm.SCM.poll(SCM.java:398)
	at hudson.model.AbstractProject._poll(AbstractProject.java:1453)
	at hudson.model.AbstractProject.poll(AbstractProject.java:1356)
	at hudson.triggers.SCMTrigger$Runner.runPolling(SCMTrigger.java:526)
	at hudson.triggers.SCMTrigger$Runner.run(SCMTrigger.java:555)
	at hudson.util.SequentialExecutionQueue$QueueEntry.run(SequentialExecutionQueue.java:119)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.526 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure170
Running org.apache.hadoop.hdfs.TestAclsEndToEnd
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 72.6 sec - in org.apache.hadoop.hdfs.TestDFSStorageStateRecovery
Running org.apache.hadoop.hdfs.TestRollingUpgradeRollback
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 43.46 sec - in org.apache.hadoop.hdfs.TestParallelRead
Running org.apache.hadoop.hdfs.TestDFSConfigKeys
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.42 sec - in org.apache.hadoop.hdfs.TestDFSConfigKeys
Running org.apache.hadoop.hdfs.TestRollingUpgradeDowngrade
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.396 sec - in org.apache.hadoop.hdfs.TestRollingUpgradeRollback
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.091 sec - in org.apache.hadoop.hdfs.TestRollingUpgradeDowngrade
Running org.apache.hadoop.hdfs.TestDFSUtil
Running org.apache.hadoop.hdfs.TestDatanodeStartupFixesLegacyStorageIDs
Tests run: 31, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.76 sec - in org.apache.hadoop.hdfs.TestDFSUtil
Running org.apache.hadoop.hdfs.TestExternalBlockReader
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.872 sec - in org.apache.hadoop.hdfs.TestDatanodeStartupFixesLegacyStorageIDs
Running org.apache.hadoop.hdfs.TestDeprecatedKeys
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.592 sec - in org.apache.hadoop.hdfs.TestDeprecatedKeys
Running org.apache.hadoop.hdfs.TestGetBlocks
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.843 sec - in org.apache.hadoop.hdfs.TestExternalBlockReader
Running org.apache.hadoop.hdfs.TestRead
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.647 sec - in org.apache.hadoop.hdfs.TestRead
Running org.apache.hadoop.hdfs.TestParallelReadUtil
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.029 sec - in org.apache.hadoop.hdfs.TestParallelReadUtil
Running org.apache.hadoop.hdfs.shortcircuit.TestShortCircuitCache
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 52.53 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStream
Running org.apache.hadoop.hdfs.shortcircuit.TestShortCircuitLocalRead
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.912 sec - in org.apache.hadoop.hdfs.shortcircuit.TestShortCircuitCache
Running org.apache.hadoop.hdfs.TestMultiThreadedHflush
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.739 sec - in org.apache.hadoop.hdfs.TestMultiThreadedHflush
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.443 sec - in org.apache.hadoop.hdfs.shortcircuit.TestShortCircuitLocalRead
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 64.05 sec - in org.apache.hadoop.hdfs.TestAclsEndToEnd
Running org.apache.hadoop.hdfs.TestReconstructStripedFile
Running org.apache.hadoop.hdfs.TestSafeModeWithStripedFile
Running org.apache.hadoop.hdfs.TestEncryptionZonesWithKMS
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 42.29 sec - in org.apache.hadoop.hdfs.TestGetBlocks
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.431 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Running org.apache.hadoop.hdfs.TestBlockMissingException
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.158 sec - in org.apache.hadoop.hdfs.TestBlockMissingException
Running org.apache.hadoop.hdfs.TestReplication
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.729 sec - in org.apache.hadoop.hdfs.TestSafeModeWithStripedFile
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure090
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 40.201 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure090
Running org.apache.hadoop.hdfs.TestFileAppend
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.687 sec - in org.apache.hadoop.hdfs.TestReplication
Running org.apache.hadoop.hdfs.TestDistributedFileSystem
Tests run: 28, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 85.718 sec - in org.apache.hadoop.hdfs.TestEncryptionZonesWithKMS
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 86.762 sec - in org.apache.hadoop.hdfs.TestReconstructStripedFile
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure160
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.439 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure160
Running org.apache.hadoop.hdfs.TestReservedRawPaths
Running org.apache.hadoop.hdfs.TestMissingBlocksAlert
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.21 sec - in org.apache.hadoop.hdfs.TestMissingBlocksAlert
Running org.apache.hadoop.hdfs.TestClose
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.134 sec - in org.apache.hadoop.hdfs.TestReservedRawPaths
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.232 sec - in org.apache.hadoop.hdfs.TestClose
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure120
Running org.apache.hadoop.hdfs.TestFileConcurrentReader
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.474 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure120
Running org.apache.hadoop.hdfs.TestHDFSServerPorts
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.202 sec - in org.apache.hadoop.hdfs.TestHDFSServerPorts
Running org.apache.hadoop.tracing.TestTracing
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 37.334 sec - in org.apache.hadoop.hdfs.TestFileAppend
Running org.apache.hadoop.tracing.TestTraceAdmin
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.028 sec - in org.apache.hadoop.tracing.TestTracing
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.726 sec - in org.apache.hadoop.tracing.TestTraceAdmin
Running org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.549 sec - in org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.997 sec - in org.apache.hadoop.hdfs.TestFileConcurrentReader
Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 53.318 sec - in org.apache.hadoop.hdfs.TestDistributedFileSystem

Results :

Tests in error: 
  TestLargeBlockReport.testBlockReportSucceedsWithLargerLengthLimit:97 NullPointer
  TestDFSUpgradeFromImage.testUpgradeFromRel1BBWImage:628->upgradeAndVerify:606->verifyFileSystem:229->verifyDir:214->dfsOpenFileWithRetries:178 » IO

Tests run: 4427, Failures: 0, Errors: 2, Skipped: 17
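
TestDFSUpgradeFromImage fails with the same IO error in both #1248 and #1249, so its surefire text report is the first place to look; a sketch, assuming the workspace layout quoted in the error block below and surefire's usual <fully.qualified.ClassName>.txt report naming:

    # Inspect the surefire report for the suite that failed in both builds.
    less hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports/org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.txt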

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [03:57 min]
[INFO] Apache Hadoop HDFS ................................. FAILURE [57:08 min]
[INFO] Apache Hadoop HDFS Native Client ................... SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.077 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:01 h
[INFO] Finished at: 2016-05-22T23:46:45+00:00
[INFO] Final Memory: 98M/3679M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Recording test results
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3

---------------------------------------------------------------------
To unsubscribe, e-mail: hdfs-dev-unsubscribe@hadoop.apache.org
For additional commands, e-mail: hdfs-dev-help@hadoop.apache.org