Posted to hdfs-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2016/03/25 23:46:44 UTC

Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #1026

See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1026/changes>

Changes:

[jlowe] MAPREDUCE-6535. TaskID default constructor results in NPE on toString().

[jlowe] YARN-4814. ATS 1.5 timelineclient impl call flush after every event
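
The MAPREDUCE-6535 entry above refers to a NullPointerException thrown when toString() is called on a TaskID built with the no-argument constructor. A minimal sketch of that pre-fix failure mode, assuming only that hadoop-mapreduce-client-core is on the classpath (the class name below is illustrative, not from the patch):

    import org.apache.hadoop.mapreduce.TaskID;

    public class TaskIdToStringRepro {
      public static void main(String[] args) {
        // The public no-arg constructor (used e.g. for Writable deserialization) left the
        // embedded JobID unset before the fix, so formatting the id dereferenced null.
        TaskID id = new TaskID();
        System.out.println(id); // pre-fix: NullPointerException; post-fix: a placeholder task id string
      }
    }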

------------------------------------------
[...truncated 4496 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.datanode.TestDataXceiverLazyPersistHint
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.013 sec - in org.apache.hadoop.hdfs.server.datanode.TestDataXceiverLazyPersistHint
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.datanode.TestBatchIbr
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.967 sec - in org.apache.hadoop.hdfs.server.datanode.TestBatchIbr
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.datanode.web.webhdfs.TestDataNodeUGIProvider
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.555 sec - in org.apache.hadoop.hdfs.server.datanode.web.webhdfs.TestDataNodeUGIProvider
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.datanode.web.webhdfs.TestParameterParser
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.59 sec - in org.apache.hadoop.hdfs.server.datanode.web.webhdfs.TestParameterParser
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.datanode.web.dtp.TestDtpHttp2
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.151 sec - in org.apache.hadoop.hdfs.server.datanode.web.dtp.TestDtpHttp2
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.datanode.TestBlockReplacement
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.57 sec - in org.apache.hadoop.hdfs.server.datanode.TestBlockReplacement
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.datanode.TestFsDatasetCacheRevocation
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.294 sec - in org.apache.hadoop.hdfs.server.datanode.TestFsDatasetCacheRevocation
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestBlockManager
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.677 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestBlockManager
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestComputeInvalidateWork
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.116 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestComputeInvalidateWork
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicyWithNodeGroup
Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.518 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicyWithNodeGroup
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestReconstructStripedBlocksWithRackAwareness
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.724 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestReconstructStripedBlocksWithRackAwareness
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicyConsiderLoad
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.835 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicyConsiderLoad
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestNodeCount
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.235 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestNodeCount
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestPendingInvalidateBlock
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 31.444 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestPendingInvalidateBlock
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestBlockInfoStriped
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.31 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestBlockInfoStriped
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestBlockTokenWithDFSStriped
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.028 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestBlockTokenWithDFSStriped
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestUnderReplicatedBlocks
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.885 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestUnderReplicatedBlocks
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestBlocksWithNotEnoughRacks
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 50.918 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestBlocksWithNotEnoughRacks
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestBlockUnderConstructionFeature
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.371 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestBlockUnderConstructionFeature
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestAvailableSpaceBlockPlacementPolicy
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.044 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestAvailableSpaceBlockPlacementPolicy
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestDatanodeDescriptor
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.278 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestDatanodeDescriptor
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestOverReplicatedBlocks
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.672 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestOverReplicatedBlocks
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestBlockInfo
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.404 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestBlockInfo
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestHeartbeatHandling
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.738 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestHeartbeatHandling
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestCachedBlocksList
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.259 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestCachedBlocksList
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestHostFileManager
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.053 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestHostFileManager
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestRBWBlockInvalidation
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.135 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestRBWBlockInvalidation
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestSequentialBlockGroupId
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.568 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestSequentialBlockGroupId
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestPendingDataNodeMessages
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.313 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestPendingDataNodeMessages
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestCorruptReplicaInfo
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.348 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestCorruptReplicaInfo
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestPendingReplication
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.427 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestPendingReplication
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestNameNodePrunesMissingStorages
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.153 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestNameNodePrunesMissingStorages
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestLowRedundancyBlockQueues
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.266 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestLowRedundancyBlockQueues
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestHost2NodesMap
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.28 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestHost2NodesMap
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestDatanodeManager
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 44.766 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestDatanodeManager
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestBlockTokenWithDFS
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.749 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestBlockTokenWithDFS
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestSequentialBlockId
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.731 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestSequentialBlockId
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicyWithUpgradeDomain
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.569 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicyWithUpgradeDomain
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestBlockReportRateLimiting
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.133 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestBlockReportRateLimiting
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy
Tests run: 62, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.661 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestBlockStatsMXBean
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.942 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestBlockStatsMXBean
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.blockmanagement.TestBlockManagerSafeMode
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.935 sec - in org.apache.hadoop.hdfs.server.blockmanagement.TestBlockManagerSafeMode
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.common.TestGetUriFromString
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.187 sec - in org.apache.hadoop.hdfs.server.common.TestGetUriFromString
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.common.TestJspHelper
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.303 sec - in org.apache.hadoop.hdfs.server.common.TestJspHelper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.mover.TestStorageMover
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 187.135 sec - in org.apache.hadoop.hdfs.server.mover.TestStorageMover
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.mover.TestMover
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 92.11 sec - in org.apache.hadoop.hdfs.server.mover.TestMover
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.balancer.TestBalancerWithMultipleNameNodes
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 45.757 sec - in org.apache.hadoop.hdfs.server.balancer.TestBalancerWithMultipleNameNodes
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.balancer.TestBalancerWithNodeGroup
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.348 sec - in org.apache.hadoop.hdfs.server.balancer.TestBalancerWithNodeGroup
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.balancer.TestBalancerWithHANameNodes
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.676 sec - in org.apache.hadoop.hdfs.server.balancer.TestBalancerWithHANameNodes
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.balancer.TestBalancerWithSaslDataTransfer
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 43.552 sec - in org.apache.hadoop.hdfs.server.balancer.TestBalancerWithSaslDataTransfer
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.balancer.TestBalancer
Tests run: 32, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 307.638 sec - in org.apache.hadoop.hdfs.server.balancer.TestBalancer
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.balancer.TestBalancerWithEncryptedTransfer
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.709 sec - in org.apache.hadoop.hdfs.server.balancer.TestBalancerWithEncryptedTransfer
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.TestFileContextAcl
Tests run: 64, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.527 sec - in org.apache.hadoop.hdfs.server.namenode.TestFileContextAcl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.TestFsck

Results :

Tests run: 893, Failures: 0, Errors: 0, Skipped: 2

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:04 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [  01:09 h]
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.075 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:13 h
[INFO] Finished at: 2016-03-25T22:46:08+00:00
[INFO] Final Memory: 71M/1101M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs> && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter2217753940655527824.jar> <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire4223302063179276982tmp> <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_353985983508810208426tmp>
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
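
For the surefire failure above ("The forked VM terminated without properly saying goodbye"), the plugin reports this whenever the forked test JVM exits before handing its results back, for example after a native crash or an explicit System.exit. A hedged, hypothetical JUnit 4 test (not from this code base) that reproduces the same symptom when run under maven-surefire-plugin:

    import org.junit.Test;

    public class ForkKillerTest {
      @Test
      public void exitsTheForkedJvm() {
        // Terminating the JVM here prevents surefire's fork from reporting back,
        // which the plugin surfaces as "The forked VM terminated without properly
        // saying goodbye. VM crash or System.exit called?"
        System.exit(1);
      }
    }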

Hadoop-Hdfs-trunk-Java8 - Build # 1027 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1027/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 6009 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:08 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [  03:32 h]
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.088 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:36 h
[INFO] Finished at: 2016-03-26T03:35:20+00:00
[INFO] Final Memory: 58M/490M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestHFlush.testHFlushInterrupted

Error Message:
The stream is closed

Stack Trace:
java.io.IOException: The stream is closed
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:118)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at java.io.FilterOutputStream.close(FilterOutputStream.java:158)
	at org.apache.hadoop.hdfs.DataStreamer.closeStream(DataStreamer.java:877)
	at org.apache.hadoop.hdfs.DataStreamer.closeInternal(DataStreamer.java:726)
	at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:721)


FAILED:  org.apache.hadoop.hdfs.TestWriteReadStripedFile.testConcatWithDifferentECPolicy

Error Message:
org/apache/hadoop/ipc/ProtobufHelper

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/ipc/ProtobufHelper
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.concat(ClientNamenodeProtocolTranslatorPB.java:511)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy26.concat(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.concat(DFSClient.java:1504)
	at org.apache.hadoop.hdfs.DistributedFileSystem.concat(DistributedFileSystem.java:611)
	at org.apache.hadoop.hdfs.TestWriteReadStripedFile.testConcatWithDifferentECPolicy(TestWriteReadStripedFile.java:313)
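
The NoClassDefFoundError above means org.apache.hadoop.ipc.ProtobufHelper was resolvable when the test code was compiled but could not be loaded at call time; the frames show the application class loader's lookup failing. A minimal, generic probe (assuming nothing beyond a JDK) that performs the same resolution step; run it without hadoop-common on the classpath and the lookup fails the same way:

    public class ClasspathProbe {
      public static void main(String[] args) {
        try {
          // Same lookup the URLClassLoader performs in the stack trace above.
          Class<?> c = Class.forName("org.apache.hadoop.ipc.ProtobufHelper");
          System.out.println("Resolved: " + c.getName());
        } catch (ClassNotFoundException e) {
          System.out.println("Not on the runtime classpath: " + e.getMessage());
        }
      }
    }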



Hadoop-Hdfs-trunk-Java8 - Build # 1028 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1028/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9942 lines...]
main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:21 min]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [  03:44 h]
[INFO] Apache Hadoop HDFS Native Client .................. FAILURE [ 21.188 s]
[INFO] Apache Hadoop HttpFS .............................. FAILURE [03:38 min]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. FAILURE [04:12 min]
[INFO] Apache Hadoop HDFS-NFS ............................ FAILURE [01:34 min]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.065 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:58 h
[INFO] Finished at: 2016-03-26T08:41:21+00:00
[INFO] Final Memory: 91M/1102M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-assembly-plugin:2.4:single (dist) on project hadoop-hdfs-native-client: Error reading assemblies: Error reading descriptor: hadoop-dist: invalid code lengths set -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs-httpfs: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs-httpfs && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/surefire/surefirebooter1493329966080673835.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/surefire/surefire6181007751434640468tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/surefire/surefire_6361454370265391423112tmp
[ERROR] -> [Help 2]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:2.15:checkstyle (default-cli) on project hadoop-hdfs-bkjournal: An error has occurred in Checkstyle report generation. Failed during checkstyle execution: Unable to process configuration file at location: checkstyle/checkstyle.xml: Cannot create file-based resource:invalid code lengths set -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:2.15:checkstyle (default-cli) on project hadoop-hdfs-nfs: An error has occurred in Checkstyle report generation. Failed during checkstyle execution: Unable to process configuration file at location: checkstyle/checkstyle.xml: Cannot create file-based resource:invalid code lengths set -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs-native-client
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed

Hadoop-Hdfs-trunk-Java8 - Build # 1029 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1029/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1136 lines...]
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... FAILURE [ 13.283 s]
[INFO] Apache Hadoop HDFS ................................ SKIPPED
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  2.099 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 16.401 s
[INFO] Finished at: 2016-03-26T08:45:12+00:00
[INFO] Final Memory: 50M/334M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hadoop-hdfs-client: Compilation failure: Compilation failure:
[ERROR] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocol/datatransfer/Sender.java:[31,39] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: package org.apache.hadoop.hdfs.protocol
[ERROR] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocol/datatransfer/DataTransferProtocol.java:[27,39] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: package org.apache.hadoop.hdfs.protocol
[ERROR] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocol/datatransfer/Sender.java:[268,34] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: class org.apache.hadoop.hdfs.protocol.datatransfer.Sender
[ERROR] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocol/datatransfer/DataTransferProtocol.java:[212,27] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: interface org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtocol
[ERROR] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java:[31,39] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: package org.apache.hadoop.hdfs.protocol
[ERROR] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java:[497,30] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: class org.apache.hadoop.hdfs.FileChecksumHelper.StripedFileNonStripedChecksumComputer
[ERROR] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java:[462,7] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: class org.apache.hadoop.hdfs.FileChecksumHelper.StripedFileNonStripedChecksumComputer
[ERROR] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java:[462,47] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: class org.apache.hadoop.hdfs.FileChecksumHelper.StripedFileNonStripedChecksumComputer
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk-Java8 - Build # 1030 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1030/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 5965 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:03 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [  03:31 h]
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.080 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:36 h
[INFO] Finished at: 2016-03-26T20:22:48+00:00
[INFO] Final Memory: 56M/445M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.hdfs.server.datanode.TestDataNodeLifeline.testNoLifelineSentIfHeartbeatsOnTime

Error Message:
Expect metrics to count no lifeline calls. expected:<0> but was:<1>

Stack Trace:
java.lang.AssertionError: Expect metrics to count no lifeline calls. expected:<0> but was:<1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.hdfs.server.datanode.TestDataNodeLifeline.testNoLifelineSentIfHeartbeatsOnTime(TestDataNodeLifeline.java:256)



Hadoop-Hdfs-trunk-Java8 - Build # 1033 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1033/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 6111 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:04 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [  04:20 h]
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.092 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 04:24 h
[INFO] Finished at: 2016-03-28T08:11:44+00:00
[INFO] Final Memory: 56M/419M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure000.test1

Error Message:
failed, dn=0, length=-1java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Options$ChecksumOpt
 at org.apache.hadoop.hdfs.client.impl.DfsClientConf.getChecksumOptFromConf(DfsClientConf.java:306)
 at org.apache.hadoop.hdfs.client.impl.DfsClientConf.<init>(DfsClientConf.java:178)
 at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:291)
 at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:275)
 at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:266)
 at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:258)
 at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2466)
 at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2512)
 at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1632)
 at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:844)
 at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:482)
 at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:441)
 at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.setup(TestDFSStripedOutputStreamWithFailure.java:186)
 at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.runTest(TestDFSStripedOutputStreamWithFailure.java:321)
 at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.run(TestDFSStripedOutputStreamWithFailure.java:527)
 at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.test1(TestDFSStripedOutputStreamWithFailure.java:531)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:483)
 at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
 at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
 at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
 at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
 at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.Options$ChecksumOpt
 at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
 at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
 at java.security.AccessController.doPrivileged(Native Method)
 at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
 at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
 ... 25 more


Stack Trace:
java.lang.AssertionError: failed, dn=0, length=-1java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Options$ChecksumOpt
	at org.apache.hadoop.hdfs.client.impl.DfsClientConf.getChecksumOptFromConf(DfsClientConf.java:306)
	at org.apache.hadoop.hdfs.client.impl.DfsClientConf.<init>(DfsClientConf.java:178)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:291)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:275)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:266)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:258)
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2466)
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2512)
	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1632)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:844)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:482)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:441)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.setup(TestDFSStripedOutputStreamWithFailure.java:186)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.runTest(TestDFSStripedOutputStreamWithFailure.java:321)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.run(TestDFSStripedOutputStreamWithFailure.java:527)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.test1(TestDFSStripedOutputStreamWithFailure.java:531)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.Options$ChecksumOpt
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 25 more

	at org.junit.Assert.fail(Assert.java:88)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.runTest(TestDFSStripedOutputStreamWithFailure.java:327)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.run(TestDFSStripedOutputStreamWithFailure.java:527)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.test1(TestDFSStripedOutputStreamWithFailure.java:531)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)


FAILED:  org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure000.test5

Error Message:
failed, dn=0, length=65536java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Options$ChecksumOpt
 at org.apache.hadoop.hdfs.client.impl.DfsClientConf.getChecksumOptFromConf(DfsClientConf.java:306)
 at org.apache.hadoop.hdfs.client.impl.DfsClientConf.<init>(DfsClientConf.java:178)
 at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:291)
 at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:275)
 at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:266)
 at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:258)
 at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2466)
 at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2512)
 at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1632)
 at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:844)
 at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:482)
 at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:441)
 at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.setup(TestDFSStripedOutputStreamWithFailure.java:186)
 at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.runTest(TestDFSStripedOutputStreamWithFailure.java:321)
 at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.run(TestDFSStripedOutputStreamWithFailure.java:527)
 at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.test5(TestDFSStripedOutputStreamWithFailure.java:535)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:483)
 at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
 at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
 at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
 at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
 at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)


Stack Trace:
java.lang.AssertionError: failed, dn=0, length=65536java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Options$ChecksumOpt
	at org.apache.hadoop.hdfs.client.impl.DfsClientConf.getChecksumOptFromConf(DfsClientConf.java:306)
	at org.apache.hadoop.hdfs.client.impl.DfsClientConf.<init>(DfsClientConf.java:178)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:291)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:275)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:266)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:258)
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2466)
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2512)
	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1632)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:844)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:482)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:441)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.setup(TestDFSStripedOutputStreamWithFailure.java:186)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.runTest(TestDFSStripedOutputStreamWithFailure.java:321)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.run(TestDFSStripedOutputStreamWithFailure.java:527)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.test5(TestDFSStripedOutputStreamWithFailure.java:535)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)

	at org.junit.Assert.fail(Assert.java:88)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.runTest(TestDFSStripedOutputStreamWithFailure.java:327)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.run(TestDFSStripedOutputStreamWithFailure.java:527)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.test5(TestDFSStripedOutputStreamWithFailure.java:535)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #1033

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1033/changes>

Changes:

[jianhe] YARN-4117. End to end unit test with mini YARN cluster for AMRMProxy

------------------------------------------
[...truncated 5918 lines...]
Running org.apache.hadoop.hdfs.protocol.TestAnnotations
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.143 sec - in org.apache.hadoop.hdfs.protocol.TestAnnotations
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.protocol.TestBlockListAsLongs
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.969 sec - in org.apache.hadoop.hdfs.protocol.TestBlockListAsLongs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure170
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.291 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure170
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure190
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 55.328 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure190
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.736 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSConfigKeys
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.231 sec - in org.apache.hadoop.hdfs.TestDFSConfigKeys
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestParallelUnixDomainRead
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 38.77 sec - in org.apache.hadoop.hdfs.TestParallelUnixDomainRead
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure140
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.333 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure140
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure100
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.393 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure100
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestReplication
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 57.932 sec - in org.apache.hadoop.hdfs.TestReplication
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestFileChecksum
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.201 sec - in org.apache.hadoop.hdfs.TestFileChecksum
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestRead
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.996 sec - in org.apache.hadoop.hdfs.TestRead
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestPipelines
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.071 sec - in org.apache.hadoop.hdfs.TestPipelines
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 123.764 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDeprecatedKeys
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.464 sec - in org.apache.hadoop.hdfs.TestDeprecatedKeys
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestAclsEndToEnd
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 86.92 sec - in org.apache.hadoop.hdfs.TestAclsEndToEnd
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.551 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.349 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.568 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestFileAppend
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 44.499 sec - in org.apache.hadoop.hdfs.TestFileAppend
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.434 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestErasureCodingPolicyWithSnapshot
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.928 sec - in org.apache.hadoop.hdfs.TestErasureCodingPolicyWithSnapshot
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSRollback
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.241 sec - in org.apache.hadoop.hdfs.TestDFSRollback
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.002 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestConnCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.162 sec - in org.apache.hadoop.hdfs.TestConnCache
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestPersistBlocks
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.138 sec - in org.apache.hadoop.hdfs.TestPersistBlocks
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestSetrepDecreasing
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.988 sec - in org.apache.hadoop.hdfs.TestSetrepDecreasing
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDatanodeLayoutUpgrade
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.212 sec - in org.apache.hadoop.hdfs.TestDatanodeLayoutUpgrade
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestFileCorruption
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.817 sec - in org.apache.hadoop.hdfs.TestFileCorruption
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSStartupVersions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.858 sec - in org.apache.hadoop.hdfs.TestDFSStartupVersions
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestWriteConfigurationToDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.413 sec - in org.apache.hadoop.hdfs.TestWriteConfigurationToDFS
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.43 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS

Results :

Failed tests: 
  TestDFSStripedOutputStreamWithFailure000>TestDFSStripedOutputStreamWithFailure$TestBase.test1:531->TestDFSStripedOutputStreamWithFailure$TestBase.run:527 failed, dn=0, length=-1java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Options$ChecksumOpt
	at org.apache.hadoop.hdfs.client.impl.DfsClientConf.getChecksumOptFromConf(DfsClientConf.java:306)
	at org.apache.hadoop.hdfs.client.impl.DfsClientConf.<init>(DfsClientConf.java:178)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:291)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:275)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:266)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:258)
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2466)
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2512)
	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1632)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:844)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:482)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:441)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.setup(TestDFSStripedOutputStreamWithFailure.java:186)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.runTest(TestDFSStripedOutputStreamWithFailure.java:321)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.run(TestDFSStripedOutputStreamWithFailure.java:527)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.test1(TestDFSStripedOutputStreamWithFailure.java:531)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.Options$ChecksumOpt
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 25 more

  TestDFSStripedOutputStreamWithFailure000>TestDFSStripedOutputStreamWithFailure$TestBase.test5:535->TestDFSStripedOutputStreamWithFailure$TestBase.run:527 failed, dn=0, length=65536java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Options$ChecksumOpt
	at org.apache.hadoop.hdfs.client.impl.DfsClientConf.getChecksumOptFromConf(DfsClientConf.java:306)
	at org.apache.hadoop.hdfs.client.impl.DfsClientConf.<init>(DfsClientConf.java:178)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:291)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:275)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:266)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:258)
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2466)
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2512)
	at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1632)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:844)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:482)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:441)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.setup(TestDFSStripedOutputStreamWithFailure.java:186)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure.runTest(TestDFSStripedOutputStreamWithFailure.java:321)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.run(TestDFSStripedOutputStreamWithFailure.java:527)
	at org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure$TestBase.test5(TestDFSStripedOutputStreamWithFailure.java:535)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)


Tests run: 4349, Failures: 2, Errors: 0, Skipped: 17
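Both failures in this run trace to the same NoClassDefFoundError for org.apache.hadoop.fs.Options$ChecksumOpt, raised while DfsClientConf.getChecksumOptFromConf reads checksum settings during MiniDFSCluster startup, i.e. the hadoop-common class could not be loaded from the test classpath at that moment. For orientation only, a rough Java sketch of that kind of lookup is below; it is not the actual DfsClientConf code, and the configuration keys and defaults shown are the usual HDFS client ones, assumed here for illustration.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Options.ChecksumOpt;
import org.apache.hadoop.util.DataChecksum;

public class ChecksumOptFromConf {
  // Illustrative sketch only: builds a ChecksumOpt from commonly used HDFS client
  // keys. Key names and defaults are assumptions, not copied from DfsClientConf.
  static ChecksumOpt getChecksumOptFromConf(Configuration conf) {
    DataChecksum.Type type =
        DataChecksum.Type.valueOf(conf.get("dfs.checksum.type", "CRC32C"));
    int bytesPerChecksum = conf.getInt("dfs.bytes-per-checksum", 512);
    // Constructing ChecksumOpt is the point where the runs above failed:
    // the class itself could not be resolved (NoClassDefFoundError).
    return new ChecksumOpt(type, bytesPerChecksum);
  }

  public static void main(String[] args) {
    System.out.println(getChecksumOptFromConf(new Configuration()));
  }
}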

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:04 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [  04:20 h]
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.092 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 04:24 h
[INFO] Finished at: 2016-03-28T08:11:44+00:00
[INFO] Final Memory: 56M/419M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results

Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #1032

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1032/changes>

Changes:

[kasha] YARN-4805. Don't go through all schedulers in ParameterizedTestBase.

------------------------------------------
[...truncated 5775 lines...]
Running org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Tests run: 4, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.327 sec - in org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestLease
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.755 sec - in org.apache.hadoop.hdfs.TestLease
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestInjectionForSimulatedStorage
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.413 sec - in org.apache.hadoop.hdfs.TestInjectionForSimulatedStorage
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestHFlush
Tests run: 14, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 44.424 sec <<< FAILURE! - in org.apache.hadoop.hdfs.TestHFlush
testHFlushInterrupted(org.apache.hadoop.hdfs.TestHFlush)  Time elapsed: 1.679 sec  <<< ERROR!
java.io.IOException: The stream is closed
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:118)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at java.io.FilterOutputStream.close(FilterOutputStream.java:158)
	at org.apache.hadoop.hdfs.DataStreamer.closeStream(DataStreamer.java:877)
	at org.apache.hadoop.hdfs.DataStreamer.closeInternal(DataStreamer.java:726)
	at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:721)

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestErasureCodingPolicies
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.464 sec - in org.apache.hadoop.hdfs.TestErasureCodingPolicies
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestRemoteBlockReader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.653 sec - in org.apache.hadoop.hdfs.TestRemoteBlockReader
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestHdfsAdmin
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.099 sec - in org.apache.hadoop.hdfs.TestHdfsAdmin
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDistributedFileSystem
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 66.652 sec - in org.apache.hadoop.hdfs.TestDistributedFileSystem
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestRollingUpgradeRollback
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.369 sec - in org.apache.hadoop.hdfs.TestRollingUpgradeRollback
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestRollingUpgrade
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 121.214 sec - in org.apache.hadoop.hdfs.TestRollingUpgrade
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDatanodeDeath
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 64.122 sec - in org.apache.hadoop.hdfs.TestDatanodeDeath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.677 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestFsShellPermission
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.597 sec - in org.apache.hadoop.hdfs.TestFsShellPermission
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.protocol.TestLocatedBlock
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.24 sec - in org.apache.hadoop.hdfs.protocol.TestLocatedBlock
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.protocol.TestLayoutVersion
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.261 sec - in org.apache.hadoop.hdfs.protocol.TestLayoutVersion
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.protocol.datatransfer.sasl.TestSaslDataTransfer
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 36.283 sec - in org.apache.hadoop.hdfs.protocol.datatransfer.sasl.TestSaslDataTransfer
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.protocol.datatransfer.TestPacketReceiver
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.443 sec - in org.apache.hadoop.hdfs.protocol.datatransfer.TestPacketReceiver
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.protocol.TestAnnotations
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.153 sec - in org.apache.hadoop.hdfs.protocol.TestAnnotations
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.protocol.TestBlockListAsLongs
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1 sec - in org.apache.hadoop.hdfs.protocol.TestBlockListAsLongs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure170
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 54.927 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure170
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure190
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.292 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure190
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.017 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSConfigKeys
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.196 sec - in org.apache.hadoop.hdfs.TestDFSConfigKeys
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestParallelUnixDomainRead
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 38.504 sec - in org.apache.hadoop.hdfs.TestParallelUnixDomainRead
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure140
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 52.677 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure140
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure100
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 94.164 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure100
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestReplication
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 58.21 sec - in org.apache.hadoop.hdfs.TestReplication
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestFileChecksum
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 50.079 sec - in org.apache.hadoop.hdfs.TestFileChecksum
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestRead
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.868 sec - in org.apache.hadoop.hdfs.TestRead
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestPipelines
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.523 sec - in org.apache.hadoop.hdfs.TestPipelines
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 121.037 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDeprecatedKeys
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.522 sec - in org.apache.hadoop.hdfs.TestDeprecatedKeys
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestAclsEndToEnd
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 85.389 sec - in org.apache.hadoop.hdfs.TestAclsEndToEnd
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.666 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.165 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.563 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestFileAppend
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 42.687 sec - in org.apache.hadoop.hdfs.TestFileAppend
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.193 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestErasureCodingPolicyWithSnapshot
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.888 sec - in org.apache.hadoop.hdfs.TestErasureCodingPolicyWithSnapshot
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSRollback
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.578 sec - in org.apache.hadoop.hdfs.TestDFSRollback
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.867 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestConnCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.894 sec - in org.apache.hadoop.hdfs.TestConnCache
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestPersistBlocks
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.361 sec - in org.apache.hadoop.hdfs.TestPersistBlocks
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestSetrepDecreasing
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.808 sec - in org.apache.hadoop.hdfs.TestSetrepDecreasing
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDatanodeLayoutUpgrade
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.964 sec - in org.apache.hadoop.hdfs.TestDatanodeLayoutUpgrade
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestFileCorruption
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.626 sec - in org.apache.hadoop.hdfs.TestFileCorruption
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestDFSStartupVersions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.327 sec - in org.apache.hadoop.hdfs.TestDFSStartupVersions
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestWriteConfigurationToDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.834 sec - in org.apache.hadoop.hdfs.TestWriteConfigurationToDFS
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.024 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS

Results :

Tests in error: 
  TestHFlush.testHFlushInterrupted » IO The stream is closed

Tests run: 4349, Failures: 0, Errors: 1, Skipped: 17

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:06 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [  04:07 h]
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.076 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 04:11 h
[INFO] Finished at: 2016-03-27T11:43:39+00:00
[INFO] Final Memory: 56M/442M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results

Hadoop-Hdfs-trunk-Java8 - Build # 1032 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1032/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 5968 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:06 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [  04:07 h]
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.076 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 04:11 h
[INFO] Finished at: 2016-03-27T11:43:39+00:00
[INFO] Final Memory: 56M/442M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestHFlush.testHFlushInterrupted

Error Message:
The stream is closed

Stack Trace:
java.io.IOException: The stream is closed
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:118)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at java.io.FilterOutputStream.close(FilterOutputStream.java:158)
	at org.apache.hadoop.hdfs.DataStreamer.closeStream(DataStreamer.java:877)
	at org.apache.hadoop.hdfs.DataStreamer.closeInternal(DataStreamer.java:726)
	at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:721)
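The trace above is the generic close-after-close pattern: FilterOutputStream.close() flushes the BufferedOutputStream, and that flush hits a socket stream that has already been shut down underneath it. A minimal, self-contained Java illustration of the same symptom follows, using plain JDK streams rather than Hadoop's DataStreamer; the class names in it are invented for the example.

import java.io.BufferedOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Minimal illustration (not Hadoop code): flushing a BufferedOutputStream after
// its underlying stream has been closed fails the same way the DataStreamer
// close path does in the trace above.
public class ClosedStreamDemo {
  static class ClosableSink extends OutputStream {
    private boolean closed;
    @Override public void write(int b) throws IOException {
      if (closed) throw new IOException("The stream is closed");
    }
    @Override public void close() { closed = true; }
  }

  public static void main(String[] args) throws IOException {
    ClosableSink sink = new ClosableSink();
    BufferedOutputStream out = new BufferedOutputStream(sink);
    out.write(42);   // buffered; nothing reaches the sink yet
    sink.close();    // underlying stream closed out from under the writer
    out.close();     // close() flushes the buffer -> IOException: The stream is closed
  }
}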



Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #1031

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1031/changes>

Changes:

[uma.gangumalla] HDFS-9694. Make existing DFSClient#getFileChecksum() work for striped

------------------------------------------
[...truncated 8903 lines...]
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/tools/DFSAdmin$2.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/tools/offlineEditsViewer/OfflineEditsXmlLoader$1.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/tools/offlineImageViewer/PBImageTextWriter$2.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/tools/offlineImageViewer/PBImageDelimitedTextWriter$1.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/tools/offlineImageViewer/PBImageXmlWriter$1.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/tools/offlineImageViewer/OfflineImageReconstructor$1.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/tools/offlineImageViewer/LsImageVisitor$1.class]]>
  [javadoc] [loading RegularFileObject[/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/tools/offlineImageViewer/FileDistributionVisitor$1.class]]
  [javadoc] JDiff: finished (took 0s, not including scanning the source files).
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/tools/offlineImageViewer/FSImageLoader$2.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/tools/offlineImageViewer/WebImageViewer$1.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/tools/offlineImageViewer/PBImageXmlWriter$2.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/tools/offlineImageViewer/FSImageLoader$3.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/tools/offlineImageViewer/FSImageLoader$1.class]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/util/IOUtilsClient.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/util/CombinedHostsFileWriter.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/util/ByteBufferOutputStream.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/util/ByteArrayManager.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/util/ByteArrayManager$1.class)]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/util/ReadOnlyList$Util$1.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/util/FoldedTreeSet$1.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/util/CyclicIteration$1.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/util/LightWeightLinkedSet$1.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/util/LightWeightHashSet$1.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/util/ReadOnlyList$Util$2.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/util/Diff$1.class]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$FsPathOutputStreamRunner.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$FsPathOutputStreamRunner$1.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$AbstractRunner.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$AbstractRunner$1.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$9.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$8.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$7.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$6.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$5.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$4.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$3.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$2.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$16.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$15.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$14.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$13.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$12.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$11.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/WebHdfsFileSystem$10.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/URLConnectionFactory$2.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/URLConnectionFactory$1.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/TokenAspect.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/SWebHdfsFileSystem.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/KerberosUgiAuthenticator.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/KerberosUgiAuthenticator$1.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/JsonUtilClient.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/ByteRangeInputStream.class)]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/web/AuthFilter$1.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/web/AuthFilter$1$1.class]]>
  [javadoc] [loading RegularFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/classes/org/apache/hadoop/hdfs/web/ParamFilter$1.class]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/resources/ShortParam.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/resources/Param$1.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/resources/LongParam.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/resources/IntegerParam.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/resources/EnumSetParam.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/resources/EnumParam.class)]]>
  [javadoc] [loading ZipFileIndexFileObject[<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-hdfs-client-3.0.0-SNAPSHOT.jar(org/apache/hadoop/hdfs/web/resources/BooleanParam.class)]]>
  [javadoc] [done in 4256 ms]
  [javadoc] Generating Javadoc
  [javadoc] Javadoc execution
  [javadoc] Loading source file <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/dev-support/jdiff/Null.java...>
  [javadoc] Loading source files for package org.apache.hadoop.hdfs...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.client...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.net...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.protocol...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.protocol.datatransfer...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.protocol.datatransfer.sasl...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.protocolPB...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.qjournal.client...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.qjournal.protocol...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.qjournal.protocolPB...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.qjournal.server...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.security.token.block...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.security.token.delegation...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.balancer...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.blockmanagement...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.common...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.datanode...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.datanode.erasurecode...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.datanode.fsdataset...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.datanode.fsdataset.impl...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.datanode.metrics...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.datanode.web...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.datanode.web.dtp...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.datanode.web.webhdfs...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.mover...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode.ha...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode.metrics...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode.snapshot...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode.startupprogress...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode.top...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode.top.metrics...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode.top.window...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.namenode.web.resources...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.server.protocol...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.tools...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.tools.erasurecode...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.tools.offlineEditsViewer...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.tools.offlineImageViewer...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.tools.snapshot...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.util...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.web...
  [javadoc] Loading source files for package org.apache.hadoop.hdfs.web.resources...
  [javadoc] Constructing Javadoc information...
  [javadoc] ExcludePrivateAnnotationsJDiffDoclet
  [javadoc] JDiff: doclet started ...
  [javadoc] JDiff: reading the old API in from file '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/dev-support/jdiff/Apache_Hadoop_HDFS_2.6.0.xml'...> Warning: API identifier in the XML file (hadoop-hdfs 2.6.0) differs from the name of the file 'Apache_Hadoop_HDFS_2.6.0.xml'
  [javadoc]  finished
  [javadoc] JDiff: reading the new API in from file '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/site/jdiff/xml/Apache_Hadoop_HDFS_3.0.0-SNAPSHOT.xml'...> finished
  [javadoc] JDiff: comparing the old and new APIs ...
  [javadoc] Warning: no classes found in the package org.apache.hadoop.hdfs.net
  [javadoc] Warning: no classes found in the package org.apache.hadoop.hdfs.protocol.datatransfer.sasl
  [javadoc] Warning: no classes found in the package org.apache.hadoop.hdfs.qjournal.client
  [javadoc] Warning: no classes found in the package org.apache.hadoop.hdfs.qjournal.protocol
  [javadoc] Warning: no classes found in the package org.apache.hadoop.hdfs.qjournal.protocolPB
  [javadoc] Warning: no classes found in the package org.apache.hadoop.hdfs.server.mover
  [javadoc] Warning: no classes found in the package org.apache.hadoop.hdfs.tools.snapshot
  [javadoc]  Approximately 42% difference between the APIs
  [javadoc] JDiff: reading the comments in from file '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/site/jdiff/xml/user_comments_for_Apache_Hadoop_HDFS_2.6.0_to_Apache_Hadoop_HDFS_3.0.0-SNAPSHOT.xml'...>
  [javadoc]  (this will be created)
  [javadoc] JDiff: generating HTML report into the file '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/site/jdiff/xml/changes.html'> and the subdirectory '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/site/jdiff/xml/changes'>
  [javadoc] Note: all the comments have been newly generated
  [javadoc] JDiff: writing the comments out to file '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/site/jdiff/xml/user_comments_for_Apache_Hadoop_HDFS_2.6.0_to_Apache_Hadoop_HDFS_3.0.0-SNAPSHOT.xml'...>
  [javadoc] JDiff: finished (took 0s).
     [xslt] Processing <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/findbugsXml.xml> to <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/site/findbugs.html>
     [xslt] Loading stylesheet /home/jenkins/tools/findbugs/latest/src/xsl/default.xsl
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (pre-dist) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] >>> maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs >>>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc) @ hadoop-hdfs ---
[INFO] No changes detected in protoc files, skipping generation.
[INFO] 
[INFO] <<< maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs <<<
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs ---
[INFO] 
ExcludePrivateAnnotationsStandardDoclet
7 warnings
[WARNING] Javadoc Warnings
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/blockmanagement/BlockIdManager.java>:34: warning - Tag @see cannot be used in inline documentation.  It can only be used in the following types of documentation: overview, package, class/interface, constructor, field, method.
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/blockmanagement/BlockIdManager.java>:34: warning - Tag @see cannot be used in inline documentation.  It can only be used in the following types of documentation: overview, package, class/interface, constructor, field, method.
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/blockmanagement/BlockIdManager.java>:34: warning - Tag @see: reference not found: FSNamesystem
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/blockmanagement/BlockIdManager.java>:34: warning - Tag @see: reference not found: EditLog
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/blockmanagement/BlockIdManager.java>:34: warning - Tag @see cannot be used in inline documentation.  It can only be used in the following types of documentation: overview, package, class/interface, constructor, field, method.
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/blockmanagement/BlockIdManager.java>:34: warning - Tag @see cannot be used in inline documentation.  It can only be used in the following types of documentation: overview, package, class/interface, constructor, field, method.
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/ShortCircuitRegistry.java>:82: warning - Tag @link: reference not found: DfsClientShmManager
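
The seven javadoc warnings above have two sources: six come from {@see ...} used as an inline tag in BlockIdManager's class comment (four for the inline misuse itself, two because the FSNamesystem and EditLog targets cannot be resolved), and one from an unresolvable {@link DfsClientShmManager} in ShortCircuitRegistry. A minimal sketch of the pattern and its usual fix follows; it is hypothetical Java, not the actual Hadoop source:

    /**
     * Broken: @see is a block-only tag, so the inline forms {@see FSNamesystem}
     * or {@see EditLog} trigger "Tag @see cannot be used in inline
     * documentation" (and, separately, "reference not found" when the target
     * is not imported or fully qualified).
     *
     * Fixed: use the inline {@link} tag with a resolvable target, for example
     * {@link org.apache.hadoop.hdfs.server.namenode.FSNamesystem}.
     */
    final class SeeTagSketch {}
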
[INFO] 
[INFO] --- maven-assembly-plugin:2.4:single (dist) @ hadoop-hdfs ---
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:16 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [  03:35 h]
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.103 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:39 h
[INFO] Finished at: 2016-03-27T07:26:47+00:00
[INFO] Final Memory: 60M/463M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-assembly-plugin:2.4:single (dist) on project hadoop-hdfs: Error reading assemblies: Error reading descriptor: hadoop-dist: invalid code lengths set -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
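
"invalid code lengths set" is a zlib inflate error that java.util.zip surfaces as a ZipException, which suggests the assembly step above failed on a corrupt archive (likely a damaged copy of the jar providing the shared "hadoop-dist" assembly descriptor in the slave's local Maven repository) rather than on the descriptor's contents; deleting the cached artifact and rebuilding is the usual remedy. A self-contained sketch of how a damaged archive yields this family of messages (file and entry names here are made up):

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipInputStream;
    import java.util.zip.ZipOutputStream;

    public class CorruptArchiveDemo {
      public static void main(String[] args) throws IOException {
        Path jar = Files.createTempFile("assembly-descriptors", ".jar");
        // Write a small jar with one compressed entry.
        try (ZipOutputStream out = new ZipOutputStream(Files.newOutputStream(jar))) {
          out.putNextEntry(new ZipEntry("assemblies/hadoop-dist.xml"));
          byte[] line = "<assembly><id>hadoop-dist</id></assembly>\n"
              .getBytes(StandardCharsets.UTF_8);
          for (int i = 0; i < 1000; i++) {
            out.write(line);
          }
          out.closeEntry();
        }

        // Flip bits inside the compressed data to simulate a damaged download.
        byte[] bytes = Files.readAllBytes(jar);
        for (int i = 80; i < 160 && i < bytes.length; i++) {
          bytes[i] ^= 0x55;
        }
        Files.write(jar, bytes);

        // Reading the entry back now typically fails inside the inflater; the
        // message comes from zlib ("invalid code lengths set", "invalid
        // distance code", ...) and varies with where the damage lands.
        try (ZipInputStream in = new ZipInputStream(Files.newInputStream(jar))) {
          in.getNextEntry();
          byte[] buf = new byte[4096];
          while (in.read(buf) != -1) {
            // drain the entry
          }
        } catch (IOException e) {
          System.out.println(e.getClass().getSimpleName() + ": " + e.getMessage());
        }
      }
    }
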
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results

Hadoop-Hdfs-trunk-Java8 - Build # 1031 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1031/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9096 lines...]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/target
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:16 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [  03:35 h]
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.103 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:39 h
[INFO] Finished at: 2016-03-27T07:26:47+00:00
[INFO] Final Memory: 60M/463M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-assembly-plugin:2.4:single (dist) on project hadoop-hdfs: Error reading assemblies: Error reading descriptor: hadoop-dist: invalid code lengths set -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed

Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #1030

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1030/changes>

Changes:

[arp] Revert "HDFS-9694. Make existing DFSClient#getFileChecksum() work for

------------------------------------------
[...truncated 5772 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.848 sec - in org.apache.hadoop.fs.TestUnbuffer
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.805 sec - in org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Tests run: 10, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 17.654 sec - in org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.167 sec - in org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Tests run: 68, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.926 sec - in org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.shell.TestHdfsTextCommand
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.749 sec - in org.apache.hadoop.fs.shell.TestHdfsTextCommand
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestSWebHdfsFileContextMainOperations
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.871 sec - in org.apache.hadoop.fs.TestSWebHdfsFileContextMainOperations
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestResolveHdfsSymlink
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.185 sec - in org.apache.hadoop.fs.TestResolveHdfsSymlink
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestWebHdfsFileContextMainOperations
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.257 sec - in org.apache.hadoop.fs.TestWebHdfsFileContextMainOperations
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFsWithAcls
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.132 sec - in org.apache.hadoop.fs.viewfs.TestViewFsWithAcls
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.831 sec - in org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.422 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFsWithXAttrs
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.914 sec - in org.apache.hadoop.fs.viewfs.TestViewFsWithXAttrs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Tests run: 58, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.113 sec - in org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Tests run: 58, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.92 sec - in org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemWithAcls
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.268 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemWithAcls
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.183 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemWithXAttrs
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.194 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemWithXAttrs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.617 sec - in org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Tests run: 71, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.486 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractDelete
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.532 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractDelete
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractSetTimes
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.377 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractSetTimes
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractMkdir
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.644 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractMkdir
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractOpen
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.504 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractOpen
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractRootDirectory
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.813 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractRootDirectory
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractAppend
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.608 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractAppend
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractCreate
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.756 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractCreate
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractSeek
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.904 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractSeek
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractRename
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.739 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractRename
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractGetFileStatus
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.373 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractGetFileStatus
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractConcat
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.201 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractConcat
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.permission.TestStickyBit
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.652 sec - in org.apache.hadoop.fs.permission.TestStickyBit
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 40.84 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.551 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.TestRefreshCallQueue
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.514 sec - in org.apache.hadoop.TestRefreshCallQueue
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.security.TestPermissionSymlinks
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.707 sec - in org.apache.hadoop.security.TestPermissionSymlinks
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.security.TestRefreshUserMappings
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.781 sec - in org.apache.hadoop.security.TestRefreshUserMappings
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.security.TestPermission
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.892 sec - in org.apache.hadoop.security.TestPermission
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.tools.TestTools
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.329 sec - in org.apache.hadoop.tools.TestTools
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.tools.TestJMXGet
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.068 sec - in org.apache.hadoop.tools.TestJMXGet
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.tools.TestHdfsConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.563 sec - in org.apache.hadoop.tools.TestHdfsConfigFields
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.tracing.TestTracing
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.952 sec - in org.apache.hadoop.tracing.TestTracing
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.tracing.TestTraceAdmin
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.121 sec - in org.apache.hadoop.tracing.TestTraceAdmin
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.953 sec - in org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.net.TestNetworkTopology
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.521 sec - in org.apache.hadoop.net.TestNetworkTopology
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.TestGenericRefresh
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.36 sec - in org.apache.hadoop.TestGenericRefresh
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 77.606 sec - in org.apache.hadoop.cli.TestHDFSCLI
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.cli.TestCacheAdminCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.116 sec - in org.apache.hadoop.cli.TestCacheAdminCLI
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.cli.TestAclCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.734 sec - in org.apache.hadoop.cli.TestAclCLI
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.cli.TestErasureCodingCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.693 sec - in org.apache.hadoop.cli.TestErasureCodingCLI
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.cli.TestCryptoAdminCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.349 sec - in org.apache.hadoop.cli.TestCryptoAdminCLI
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.cli.TestXAttrCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.909 sec - in org.apache.hadoop.cli.TestXAttrCLI
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.cli.TestDeleteCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.58 sec - in org.apache.hadoop.cli.TestDeleteCLI

Results :

Failed tests: 
  TestDataNodeLifeline.testNoLifelineSentIfHeartbeatsOnTime:256 Expect metrics to count no lifeline calls. expected:<0> but was:<1>

Tests run: 4341, Failures: 1, Errors: 0, Skipped: 17
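
The single failure above is a metrics assertion in the datanode lifeline test: zero lifeline RPCs were expected while heartbeats stayed on time, but the counter read 1. The "expected:<0> but was:<1>" tail is JUnit 4's standard assertEquals formatting; the sketch below only illustrates that shape and is hypothetical, not the actual TestDataNodeLifeline source:

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class LifelineMetricsSketch {

      /** Stand-in for the metrics counter the real test reads; the failed run saw 1. */
      private long getLifelineCallCount() {
        return 1;
      }

      @Test
      public void testNoLifelineSentIfHeartbeatsOnTime() {
        // On failure JUnit prints the message plus the expected/actual pair:
        //   Expect metrics to count no lifeline calls. expected:<0> but was:<1>
        assertEquals("Expect metrics to count no lifeline calls.", 0,
            getLifelineCallCount());
      }
    }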

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:03 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [  03:31 h]
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.080 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:36 h
[INFO] Finished at: 2016-03-26T20:22:48+00:00
[INFO] Final Memory: 56M/445M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results

Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #1029

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1029/changes>

Changes:

[uma.gangumalla] HDFS-9694. Make existing DFSClient#getFileChecksum() work for striped

------------------------------------------
[...truncated 944 lines...]
[INFO] Apache Hadoop Rumen ............................... SKIPPED
[INFO] Apache Hadoop Gridmix ............................. SKIPPED
[INFO] Apache Hadoop Data Join ........................... SKIPPED
[INFO] Apache Hadoop Ant Tasks ........................... SKIPPED
[INFO] Apache Hadoop Extras .............................. SKIPPED
[INFO] Apache Hadoop Pipes ............................... SKIPPED
[INFO] Apache Hadoop OpenStack support ................... SKIPPED
[INFO] Apache Hadoop Amazon Web Services support ......... SKIPPED
[INFO] Apache Hadoop Azure support ....................... SKIPPED
[INFO] Apache Hadoop Client .............................. SKIPPED
[INFO] Apache Hadoop Mini-Cluster ........................ SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator ............ SKIPPED
[INFO] Apache Hadoop Tools Dist .......................... SKIPPED
[INFO] Apache Hadoop Kafka Library support ............... SKIPPED
[INFO] Apache Hadoop Tools ............................... SKIPPED
[INFO] Apache Hadoop Distribution ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 43.338 s
[INFO] Finished at: 2016-03-26T08:44:43+00:00
[INFO] Final Memory: 86M/582M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hadoop-hdfs-client: Compilation failure: Compilation failure:
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocol/datatransfer/Sender.java>:[31,39] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: package org.apache.hadoop.hdfs.protocol
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocol/datatransfer/DataTransferProtocol.java>:[27,39] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: package org.apache.hadoop.hdfs.protocol
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocol/datatransfer/Sender.java>:[268,34] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: class org.apache.hadoop.hdfs.protocol.datatransfer.Sender
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocol/datatransfer/DataTransferProtocol.java>:[212,27] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: interface org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtocol
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java>:[31,39] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: package org.apache.hadoop.hdfs.protocol
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java>:[497,30] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: class org.apache.hadoop.hdfs.FileChecksumHelper.StripedFileNonStripedChecksumComputer
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java>:[462,7] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: class org.apache.hadoop.hdfs.FileChecksumHelper.StripedFileNonStripedChecksumComputer
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java>:[462,47] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: class org.apache.hadoop.hdfs.FileChecksumHelper.StripedFileNonStripedChecksumComputer
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs-client
+ cd hadoop-hdfs-project
+ /home/jenkins/tools/maven/latest/bin/mvn clean verify checkstyle:checkstyle findbugs:findbugs -Drequire.test.libhadoop -Pdist -Pnative -Dtar -Pdocs -fae
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] Apache Hadoop HDFS Client
[INFO] Apache Hadoop HDFS
[INFO] Apache Hadoop HDFS Native Client
[INFO] Apache Hadoop HttpFS
[INFO] Apache Hadoop HDFS BookKeeper Journal
[INFO] Apache Hadoop HDFS-NFS
[INFO] Apache Hadoop HDFS Project
[INFO] 
[INFO] Using the builder org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder with a thread count of 1
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Client 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-client ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-client ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/test-dir>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/test/data>
[INFO] Executed tasks
[INFO] 
[INFO] --- hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc) @ hadoop-hdfs-client ---
[INFO] Wrote protoc checksums to file <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/hadoop-maven-plugins-protoc-checksums.json>
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hadoop-hdfs-client ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-hdfs-client ---
[INFO] Compiling 235 source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/classes>
[INFO] -------------------------------------------------------------
[WARNING] COMPILATION WARNING : 
[INFO] -------------------------------------------------------------
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/shortcircuit/ShortCircuitShm.java>:[40,16] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/shortcircuit/ShortCircuitShm.java>:[57,24] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/shortcircuit/ShortCircuitShm.java>:[59,18] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/shortcircuit/ShortCircuitShm.java>:[61,17] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/shortcircuit/ShortCircuitShm.java>:[63,15] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocolPB/PBHelperClient.java>: Some input files use or override a deprecated API.
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocolPB/PBHelperClient.java>: Recompile with -Xlint:deprecation for details.
[INFO] 7 warnings 
[INFO] -------------------------------------------------------------
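
The five sun.misc.Unsafe warnings above are expected on JDK 8: ShortCircuitShm references Unsafe directly for its shared-memory slot handling, and any reference to that class draws javac's "internal proprietary API" notice. A hedged sketch of the usual access pattern (illustrative only, not the actual ShortCircuitShm code):

    import java.lang.reflect.Field;

    import sun.misc.Unsafe;  // javac: internal proprietary API and may be removed

    public final class UnsafeAccessSketch {

      /** Obtain the Unsafe singleton reflectively, as libraries typically do. */
      static Unsafe loadUnsafe() throws ReflectiveOperationException {
        Field f = Unsafe.class.getDeclaredField("theUnsafe");
        f.setAccessible(true);
        return (Unsafe) f.get(null);
      }

      public static void main(String[] args) throws ReflectiveOperationException {
        // Any use of the handle works at runtime on JDK 8; the warning is
        // purely a compile-time notice about the unsupported API.
        System.out.println("page size: " + loadUnsafe().pageSize());
      }
    }
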
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR : 
[INFO] -------------------------------------------------------------
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocol/datatransfer/Sender.java>:[31,39] cannot find symbol
  symbol:   class StripedBlockInfo
  location: package org.apache.hadoop.hdfs.protocol
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocol/datatransfer/DataTransferProtocol.java>:[27,39] cannot find symbol
  symbol:   class StripedBlockInfo
  location: package org.apache.hadoop.hdfs.protocol
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocol/datatransfer/Sender.java>:[268,34] cannot find symbol
  symbol:   class StripedBlockInfo
  location: class org.apache.hadoop.hdfs.protocol.datatransfer.Sender
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocol/datatransfer/DataTransferProtocol.java>:[212,27] cannot find symbol
  symbol:   class StripedBlockInfo
  location: interface org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtocol
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java>:[31,39] cannot find symbol
  symbol:   class StripedBlockInfo
  location: package org.apache.hadoop.hdfs.protocol
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java>:[497,30] cannot find symbol
  symbol:   class StripedBlockInfo
  location: class org.apache.hadoop.hdfs.FileChecksumHelper.StripedFileNonStripedChecksumComputer
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java>:[462,7] cannot find symbol
  symbol:   class StripedBlockInfo
  location: class org.apache.hadoop.hdfs.FileChecksumHelper.StripedFileNonStripedChecksumComputer
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java>:[462,47] cannot find symbol
  symbol:   class StripedBlockInfo
  location: class org.apache.hadoop.hdfs.FileChecksumHelper.StripedFileNonStripedChecksumComputer
[INFO] 8 errors 
[INFO] -------------------------------------------------------------
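To make the eight "cannot find symbol" errors above concrete: every one of them points at org.apache.hadoop.hdfs.protocol.StripedBlockInfo, meaning the class is imported and used by Sender, DataTransferProtocol and FileChecksumHelper but is not present in the checked-out hadoop-hdfs-client sources, which typically happens when a new file is left out of a committed patch. The stub below is purely hypothetical and only illustrates the package and name the compiler is looking for; the real class's fields and constructor cannot be recovered from this log.

    // Hypothetical stub for illustration only -- NOT the real Hadoop class.
    // It shows the fully qualified name the failing imports refer to; the
    // actual StripedBlockInfo (apparently part of the striped/erasure-coding
    // checksum work, judging by the call sites above) defines its own fields,
    // constructor and accessors, none of which appear in this build log.
    package org.apache.hadoop.hdfs.protocol;

    public class StripedBlockInfo {
      // Members intentionally omitted: not recoverable from the log.
    }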
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... FAILURE [ 13.283 s]
[INFO] Apache Hadoop HDFS ................................ SKIPPED
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  2.099 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 16.401 s
[INFO] Finished at: 2016-03-26T08:45:12+00:00
[INFO] Final Memory: 50M/334M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hadoop-hdfs-client: Compilation failure: Compilation failure:
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocol/datatransfer/Sender.java>:[31,39] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: package org.apache.hadoop.hdfs.protocol
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocol/datatransfer/DataTransferProtocol.java>:[27,39] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: package org.apache.hadoop.hdfs.protocol
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocol/datatransfer/Sender.java>:[268,34] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: class org.apache.hadoop.hdfs.protocol.datatransfer.Sender
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/protocol/datatransfer/DataTransferProtocol.java>:[212,27] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: interface org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtocol
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java>:[31,39] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: package org.apache.hadoop.hdfs.protocol
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java>:[497,30] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: class org.apache.hadoop.hdfs.FileChecksumHelper.StripedFileNonStripedChecksumComputer
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java>:[462,7] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: class org.apache.hadoop.hdfs.FileChecksumHelper.StripedFileNonStripedChecksumComputer
[ERROR] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java>:[462,47] cannot find symbol
[ERROR] symbol:   class StripedBlockInfo
[ERROR] location: class org.apache.hadoop.hdfs.FileChecksumHelper.StripedFileNonStripedChecksumComputer
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?


Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #1028

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1028/changes>

Changes:

[lei] HDFS-9005. Provide support for upgrade domain script. (Ming Ma via Lei

[lei] Add missing files from HDFS-9005. (lei)

------------------------------------------
[...truncated 9749 lines...]
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS-NFS 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-nfs ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hadoop-hdfs-nfs ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/src/main/resources>
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-hdfs-nfs ---
[INFO] Compiling 17 source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/classes>
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hadoop-hdfs-nfs ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hadoop-hdfs-nfs ---
[INFO] Compiling 13 source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/test-classes>
[INFO] 
[INFO] --- maven-surefire-plugin:2.17:test (default-test) @ hadoop-hdfs-nfs ---
[INFO] Surefire report directory: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/surefire-reports>

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.nfs.TestMountd
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.562 sec - in org.apache.hadoop.hdfs.nfs.TestMountd
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.nfs.nfs3.TestNfs3Utils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.266 sec - in org.apache.hadoop.hdfs.nfs.nfs3.TestNfs3Utils
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.nfs.nfs3.TestReaddir
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.936 sec - in org.apache.hadoop.hdfs.nfs.nfs3.TestReaddir
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.nfs.nfs3.TestClientAccessPrivilege
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.063 sec - in org.apache.hadoop.hdfs.nfs.nfs3.TestClientAccessPrivilege
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.nfs.nfs3.TestNfs3HttpServer
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.269 sec - in org.apache.hadoop.hdfs.nfs.nfs3.TestNfs3HttpServer
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.nfs.nfs3.TestRpcProgramNfs3
Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.102 sec - in org.apache.hadoop.hdfs.nfs.nfs3.TestRpcProgramNfs3
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.nfs.nfs3.TestDFSClientCache
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.361 sec - in org.apache.hadoop.hdfs.nfs.nfs3.TestDFSClientCache
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.nfs.nfs3.TestWrites
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.03 sec - in org.apache.hadoop.hdfs.nfs.nfs3.TestWrites
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.nfs.nfs3.TestOpenFileCtxCache
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.02 sec - in org.apache.hadoop.hdfs.nfs.nfs3.TestOpenFileCtxCache
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.nfs.nfs3.TestOffsetRange
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.078 sec - in org.apache.hadoop.hdfs.nfs.nfs3.TestOffsetRange
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.nfs.nfs3.TestExportsTable
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.508 sec - in org.apache.hadoop.hdfs.nfs.nfs3.TestExportsTable

Results :

Tests run: 49, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-jar-plugin:2.5:jar (default-jar) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT.jar>
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-sources.jar>
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-test-sources.jar>
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-nfs ---
[INFO] 
Loading source files for package org.apache.hadoop.hdfs.nfs.conf...
Loading source files for package org.apache.hadoop.hdfs.nfs.nfs3...
Loading source files for package org.apache.hadoop.hdfs.nfs.mount...
Constructing Javadoc information...
Standard Doclet version 1.8.0
Building tree for all the packages and classes...
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/conf/NfsConfigKeys.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/conf/NfsConfiguration.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/AsyncDataService.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/Nfs3.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/Nfs3Metrics.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/Nfs3Utils.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/OffsetRange.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/PrivilegedNfsGatewayStarter.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/RpcProgramNfs3.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/WriteManager.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/WriteManager.MultipleCachedStreamException.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/mount/Mountd.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/mount/RpcProgramMountd.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/overview-frame.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/conf/package-frame.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/conf/package-summary.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/conf/package-tree.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/mount/package-frame.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/mount/package-summary.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/mount/package-tree.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/package-frame.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/package-summary.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/package-tree.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/constant-values.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/serialized-form.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/conf/class-use/NfsConfiguration.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/conf/class-use/NfsConfigKeys.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/class-use/OffsetRange.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/class-use/AsyncDataService.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/class-use/PrivilegedNfsGatewayStarter.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/class-use/Nfs3Metrics.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/class-use/RpcProgramNfs3.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/class-use/Nfs3Utils.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/class-use/WriteManager.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/class-use/WriteManager.MultipleCachedStreamException.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/class-use/Nfs3.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/mount/class-use/RpcProgramMountd.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/mount/class-use/Mountd.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/conf/package-use.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/mount/package-use.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/org/apache/hadoop/hdfs/nfs/nfs3/package-use.html...>
Building index for all the packages and classes...
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/overview-tree.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/index-all.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/deprecated-list.html...>
Building index for all classes...
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/allclasses-frame.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/allclasses-noframe.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/index.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/overview-summary.html...>
Generating <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/help-doc.html...>
5 warnings
[WARNING] Javadoc Warnings
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/src/main/java/org/apache/hadoop/hdfs/nfs/nfs3/Nfs3Utils.java>:90: warning: no @param for childNum
[WARNING] public static long getDirSize(int childNum) {
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/src/main/java/org/apache/hadoop/hdfs/nfs/nfs3/Nfs3Utils.java>:90: warning: no @return
[WARNING] public static long getDirSize(int childNum) {
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/src/main/java/org/apache/hadoop/hdfs/nfs/nfs3/Nfs3Utils.java>:124: warning: no @param for channel
[WARNING] public static void writeChannel(Channel channel, XDR out, int xid) {
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/src/main/java/org/apache/hadoop/hdfs/nfs/nfs3/Nfs3Utils.java>:124: warning: no @param for out
[WARNING] public static void writeChannel(Channel channel, XDR out, int xid) {
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/src/main/java/org/apache/hadoop/hdfs/nfs/nfs3/Nfs3Utils.java>:124: warning: no @param for xid
[WARNING] public static void writeChannel(Channel channel, XDR out, int xid) {
[WARNING] ^
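The five Javadoc warnings above all ask for missing @param/@return tags on two Nfs3Utils methods. A minimal sketch of the kind of fix they call for follows; the tag descriptions and the method body are placeholders, since only the signatures appear in this log, and the writeChannel warnings would be resolved the same way by adding @param tags for channel, out and xid.

    // Sketch only: how the missing Javadoc tags flagged above would be added.
    // The description of childNum and the method body are assumptions; only
    // the signature is shown in the build log.
    public final class Nfs3UtilsJavadocSketch {

      /**
       * Computes the size to report for a directory.
       *
       * @param childNum number of entries under the directory
       * @return the computed directory size
       */
      public static long getDirSize(int childNum) {
        return 0L; // placeholder body, not the real implementation
      }
    }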
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-javadoc.jar>
[INFO] 
[INFO] --- maven-assembly-plugin:2.4:single (dist) @ hadoop-hdfs-nfs ---
[INFO] Reading assembly descriptor: ../../hadoop-assemblies/src/main/resources/assemblies/hadoop-hdfs-nfs-dist.xml
[WARNING] The following patterns were never triggered in this artifact exclusion filter:
o  'org.apache.hadoop:hadoop-common'
o  'org.slf4j:slf4j-api'
o  'org.slf4j:slf4j-log4j12'
o  'org.hsqldb:hsqldb'

[INFO] Copying files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT>
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-nfs ---
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:21 min]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [  03:44 h]
[INFO] Apache Hadoop HDFS Native Client .................. FAILURE [ 21.188 s]
[INFO] Apache Hadoop HttpFS .............................. FAILURE [03:38 min]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. FAILURE [04:12 min]
[INFO] Apache Hadoop HDFS-NFS ............................ FAILURE [01:34 min]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.065 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:58 h
[INFO] Finished at: 2016-03-26T08:41:21+00:00
[INFO] Final Memory: 91M/1102M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-assembly-plugin:2.4:single (dist) on project hadoop-hdfs-native-client: Error reading assemblies: Error reading descriptor: hadoop-dist: invalid code lengths set -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs-httpfs: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-httpfs> && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/surefire/surefirebooter1493329966080673835.jar> <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/surefire/surefire6181007751434640468tmp> <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/surefire/surefire_6361454370265391423112tmp>
[ERROR] -> [Help 2]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:2.15:checkstyle (default-cli) on project hadoop-hdfs-bkjournal: An error has occurred in Checkstyle report generation. Failed during checkstyle execution: Unable to process configuration file at location: checkstyle/checkstyle.xml: Cannot create file-based resource:invalid code lengths set -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:2.15:checkstyle (default-cli) on project hadoop-hdfs-nfs: An error has occurred in Checkstyle report generation. Failed during checkstyle execution: Unable to process configuration file at location: checkstyle/checkstyle.xml: Cannot create file-based resource:invalid code lengths set -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs-native-client
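For context on the "invalid code lengths set" failures above: that wording comes from java.util.zip (zlib) when a DEFLATE stream is corrupt, so a checkstyle or assembly plugin reporting it while reading checkstyle.xml or an assembly descriptor usually points at a truncated or corrupted jar (for example in the local Maven repository) rather than at the build's own sources. A minimal illustration, not taken from this build, is below; the exact exception message depends on where the corruption falls in the stream.

    import java.io.ByteArrayInputStream;
    import java.io.IOException;
    import java.util.zip.InflaterInputStream;
    import java.util.zip.ZipException;

    // Minimal illustration: feeding a corrupted DEFLATE stream to java.util.zip
    // surfaces ZipExceptions from the same family as "invalid code lengths set".
    public class CorruptDeflateDemo {
      public static void main(String[] args) throws IOException {
        // A valid zlib header (0x78 0x9C) followed by junk instead of real data.
        byte[] corrupt = {0x78, (byte) 0x9C, (byte) 0xFF, (byte) 0xFF, 0x00, 0x01};
        try (InflaterInputStream in =
                 new InflaterInputStream(new ByteArrayInputStream(corrupt))) {
          while (in.read() != -1) {
            // drain the stream; the corruption surfaces as a ZipException
          }
        } catch (ZipException e) {
          System.out.println("Decompression failed: " + e.getMessage());
        }
      }
    }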
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results

Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #1027

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1027/changes>

Changes:

[wang] HADOOP-12962. KMS key names are incorrectly encoded when creating key.

[Arun Suresh] YARN-4823. Refactor the nested reservation id field in listReservation

[jlowe] HADOOP-12958. PhantomReference for filesystem statistics can trigger

------------------------------------------
[...truncated 5816 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.276 sec - in org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Tests run: 10, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 18.805 sec - in org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.193 sec - in org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Tests run: 68, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.49 sec - in org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.shell.TestHdfsTextCommand
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.403 sec - in org.apache.hadoop.fs.shell.TestHdfsTextCommand
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestSWebHdfsFileContextMainOperations
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.091 sec - in org.apache.hadoop.fs.TestSWebHdfsFileContextMainOperations
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestResolveHdfsSymlink
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.279 sec - in org.apache.hadoop.fs.TestResolveHdfsSymlink
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestWebHdfsFileContextMainOperations
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.797 sec - in org.apache.hadoop.fs.TestWebHdfsFileContextMainOperations
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFsWithAcls
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.168 sec - in org.apache.hadoop.fs.viewfs.TestViewFsWithAcls
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.245 sec - in org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.711 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFsWithXAttrs
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.44 sec - in org.apache.hadoop.fs.viewfs.TestViewFsWithXAttrs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Tests run: 58, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.183 sec - in org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Tests run: 58, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.925 sec - in org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemWithAcls
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.351 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemWithAcls
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.46 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemWithXAttrs
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.437 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemWithXAttrs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.815 sec - in org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Tests run: 71, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.729 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractDelete
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.95 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractDelete
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractSetTimes
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.744 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractSetTimes
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractMkdir
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.692 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractMkdir
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractOpen
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.333 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractOpen
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractRootDirectory
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.552 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractRootDirectory
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractAppend
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.437 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractAppend
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractCreate
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.562 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractCreate
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractSeek
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.11 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractSeek
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractRename
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.159 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractRename
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractGetFileStatus
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.795 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractGetFileStatus
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.contract.hdfs.TestHDFSContractConcat
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.281 sec - in org.apache.hadoop.fs.contract.hdfs.TestHDFSContractConcat
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.permission.TestStickyBit
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.609 sec - in org.apache.hadoop.fs.permission.TestStickyBit
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 41.38 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.469 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.TestRefreshCallQueue
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.996 sec - in org.apache.hadoop.TestRefreshCallQueue
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.security.TestPermissionSymlinks
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.519 sec - in org.apache.hadoop.security.TestPermissionSymlinks
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.security.TestRefreshUserMappings
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.42 sec - in org.apache.hadoop.security.TestRefreshUserMappings
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.security.TestPermission
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.46 sec - in org.apache.hadoop.security.TestPermission
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.tools.TestTools
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.416 sec - in org.apache.hadoop.tools.TestTools
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.tools.TestJMXGet
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.469 sec - in org.apache.hadoop.tools.TestJMXGet
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.tools.TestHdfsConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.611 sec - in org.apache.hadoop.tools.TestHdfsConfigFields
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.tracing.TestTracing
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.455 sec - in org.apache.hadoop.tracing.TestTracing
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.tracing.TestTraceAdmin
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.228 sec - in org.apache.hadoop.tracing.TestTraceAdmin
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.21 sec - in org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.net.TestNetworkTopology
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.065 sec - in org.apache.hadoop.net.TestNetworkTopology
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.TestGenericRefresh
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.247 sec - in org.apache.hadoop.TestGenericRefresh
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 93.163 sec - in org.apache.hadoop.cli.TestHDFSCLI
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.cli.TestCacheAdminCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.425 sec - in org.apache.hadoop.cli.TestCacheAdminCLI
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.cli.TestAclCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.056 sec - in org.apache.hadoop.cli.TestAclCLI
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.cli.TestErasureCodingCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.365 sec - in org.apache.hadoop.cli.TestErasureCodingCLI
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.cli.TestCryptoAdminCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.345 sec - in org.apache.hadoop.cli.TestCryptoAdminCLI
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.cli.TestXAttrCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.481 sec - in org.apache.hadoop.cli.TestXAttrCLI
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.cli.TestDeleteCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.803 sec - in org.apache.hadoop.cli.TestDeleteCLI

Results :

Tests in error: 
  TestHFlush.testHFlushInterrupted » IO The stream is closed
  TestWriteReadStripedFile.testConcatWithDifferentECPolicy:313 » NoClassDefFound

Tests run: 4335, Failures: 0, Errors: 2, Skipped: 17

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:08 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [  03:32 h]
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.088 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:36 h
[INFO] Finished at: 2016-03-26T03:35:20+00:00
[INFO] Final Memory: 58M/490M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results