Posted to hdfs-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2016/05/16 22:21:20 UTC

Build failed in Jenkins: Hadoop-Hdfs-trunk #3150

See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/3150/changes>

Changes:

[epayne] YARN-5069. TestFifoScheduler.testResourceOverCommit race condition.

------------------------------------------
[...truncated 5206 lines...]
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.193 sec - in org.apache.hadoop.hdfs.TestDFSOutputStream
Running org.apache.hadoop.hdfs.TestSafeModeWithStripedFile
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure090
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 41.481 sec - in org.apache.hadoop.hdfs.TestFileCreationClient
Running org.apache.hadoop.hdfs.TestSeekBug
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.124 sec - in org.apache.hadoop.hdfs.TestSeekBug
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.783 sec - in org.apache.hadoop.hdfs.TestSafeModeWithStripedFile
Running org.apache.hadoop.hdfs.TestDatanodeReport
Running org.apache.hadoop.hdfs.TestDistributedFileSystem
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 39.128 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure090
Running org.apache.hadoop.hdfs.security.TestDelegationToken
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 90.925 sec - in org.apache.hadoop.hdfs.TestFileAppend3
Running org.apache.hadoop.hdfs.security.token.block.TestBlockToken
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.566 sec - in org.apache.hadoop.hdfs.TestDatanodeReport
Running org.apache.hadoop.hdfs.security.TestDelegationTokenForProxyUser
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.738 sec - in org.apache.hadoop.hdfs.security.token.block.TestBlockToken
Running org.apache.hadoop.hdfs.security.TestClientProtocolWithDelegationToken
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.764 sec - in org.apache.hadoop.hdfs.security.TestDelegationTokenForProxyUser
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.919 sec - in org.apache.hadoop.hdfs.security.TestClientProtocolWithDelegationToken
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure110
Running org.apache.hadoop.hdfs.crypto.TestHdfsCryptoStreams
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 31.606 sec - in org.apache.hadoop.hdfs.security.TestDelegationToken
Running org.apache.hadoop.hdfs.TestWriteBlockGetsBlockLengthHint
Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.457 sec - in org.apache.hadoop.hdfs.TestDistributedFileSystem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.097 sec - in org.apache.hadoop.hdfs.TestWriteBlockGetsBlockLengthHint
Running org.apache.hadoop.hdfs.TestErasureCodingPolicyWithSnapshot
Running org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.132 sec - in org.apache.hadoop.hdfs.crypto.TestHdfsCryptoStreams
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure070
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.421 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure070
Running org.apache.hadoop.hdfs.TestAclsEndToEnd
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.403 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure020
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.914 sec - in org.apache.hadoop.hdfs.TestErasureCodingPolicyWithSnapshot
Running org.apache.hadoop.hdfs.TestFileStatus
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 46.648 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure110
Running org.apache.hadoop.hdfs.TestAsyncDFSRename
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.983 sec - in org.apache.hadoop.hdfs.TestFileStatus
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.481 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.805 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Running org.apache.hadoop.hdfs.TestPread
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 72.427 sec - in org.apache.hadoop.hdfs.TestAclsEndToEnd
Running org.apache.hadoop.hdfs.TestFileAppend2
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.701 sec - in org.apache.hadoop.hdfs.TestFileAppend2
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 101.882 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure020
Running org.apache.hadoop.hdfs.TestGetFileChecksum
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.521 sec - in org.apache.hadoop.hdfs.TestGetFileChecksum
Running org.apache.hadoop.hdfs.TestLeaseRecovery2
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.004 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure000
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 99.541 sec - in org.apache.hadoop.hdfs.TestPread
Running org.apache.hadoop.hdfs.TestWriteConfigurationToDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.04 sec - in org.apache.hadoop.hdfs.TestWriteConfigurationToDFS
Running org.apache.hadoop.hdfs.TestRollingUpgrade
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.029 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure000
Running org.apache.hadoop.hdfs.TestReservedRawPaths
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.88 sec - in org.apache.hadoop.hdfs.TestReservedRawPaths
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.645 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 75.538 sec - in org.apache.hadoop.hdfs.TestLeaseRecovery2
Running org.apache.hadoop.hdfs.tools.TestDFSHAAdminMiniCluster
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 183.165 sec - in org.apache.hadoop.hdfs.TestAsyncDFSRename
Running org.apache.hadoop.hdfs.tools.TestGetConf
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.981 sec - in org.apache.hadoop.hdfs.tools.TestGetConf
Running org.apache.hadoop.hdfs.tools.TestStoragePolicyCommands
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.424 sec - in org.apache.hadoop.hdfs.tools.TestDFSHAAdminMiniCluster
Running org.apache.hadoop.hdfs.tools.TestDFSZKFailoverController
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.386 sec - in org.apache.hadoop.hdfs.tools.TestStoragePolicyCommands
Running org.apache.hadoop.hdfs.tools.TestDFSAdmin
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 36.267 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Running org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewerForXAttr
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.179 sec - in org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewerForXAttr
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.193 sec - in org.apache.hadoop.hdfs.tools.TestDFSAdmin
Running org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewerWithStripedBlocks
Running org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewer
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.386 sec - in org.apache.hadoop.hdfs.tools.TestDFSZKFailoverController
Running org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewerForContentSummary
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.845 sec - in org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewer
Running org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewerForAcl
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.357 sec - in org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewerWithStripedBlocks
Running org.apache.hadoop.hdfs.tools.TestDFSHAAdmin
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.414 sec - in org.apache.hadoop.hdfs.tools.TestDFSHAAdmin
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.143 sec - in org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewerForContentSummary
Running org.apache.hadoop.hdfs.tools.TestDelegationTokenFetcher
Running org.apache.hadoop.hdfs.tools.TestDebugAdmin
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.765 sec - in org.apache.hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewerForAcl
Running org.apache.hadoop.hdfs.tools.TestGetGroups
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.686 sec - in org.apache.hadoop.hdfs.tools.TestDelegationTokenFetcher
Running org.apache.hadoop.hdfs.tools.TestDFSAdminWithHA
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.352 sec - in org.apache.hadoop.hdfs.tools.TestDebugAdmin
Running org.apache.hadoop.hdfs.tools.offlineEditsViewer.TestOfflineEditsViewer
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.544 sec - in org.apache.hadoop.hdfs.tools.TestGetGroups
Running org.apache.hadoop.hdfs.TestHDFSFileSystemContract
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.079 sec - in org.apache.hadoop.hdfs.tools.offlineEditsViewer.TestOfflineEditsViewer
Running org.apache.hadoop.hdfs.TestClose
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.46 sec - in org.apache.hadoop.hdfs.tools.TestDFSAdminWithHA
Running org.apache.hadoop.hdfs.TestFetchImage
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.901 sec - in org.apache.hadoop.hdfs.TestClose
Running org.apache.hadoop.hdfs.TestInjectionForSimulatedStorage
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.894 sec - in org.apache.hadoop.hdfs.TestFetchImage
Running org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 114.611 sec - in org.apache.hadoop.hdfs.TestRollingUpgrade
Running org.apache.hadoop.hdfs.protocolPB.TestPBHelper
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.097 sec - in org.apache.hadoop.hdfs.protocolPB.TestPBHelper
Running org.apache.hadoop.hdfs.TestFileAppend
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.41 sec - in org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Running org.apache.hadoop.hdfs.TestRollingUpgradeRollback
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.385 sec - in org.apache.hadoop.hdfs.TestInjectionForSimulatedStorage
Running org.apache.hadoop.hdfs.TestLease
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.06 sec - in org.apache.hadoop.hdfs.TestRollingUpgradeRollback
Running org.apache.hadoop.net.TestNetworkTopology
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.213 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.TestGenericRefresh
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.542 sec - in org.apache.hadoop.net.TestNetworkTopology
Running org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.849 sec - in org.apache.hadoop.TestGenericRefresh
Running org.apache.hadoop.tracing.TestTraceAdmin
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.385 sec - in org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Running org.apache.hadoop.tracing.TestTracing
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.023 sec - in org.apache.hadoop.tracing.TestTraceAdmin
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 42.268 sec - in org.apache.hadoop.hdfs.TestFileAppend
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Tests run: 44, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 73.857 sec - in org.apache.hadoop.hdfs.TestHDFSFileSystemContract
Running org.apache.hadoop.cli.TestDeleteCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.04 sec - in org.apache.hadoop.tracing.TestTracing
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.538 sec - in org.apache.hadoop.cli.TestDeleteCLI
Running org.apache.hadoop.cli.TestAclCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.082 sec - in org.apache.hadoop.cli.TestAclCLI
Running org.apache.hadoop.cli.TestErasureCodingCLI
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Running org.apache.hadoop.cli.TestCacheAdminCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.512 sec - in org.apache.hadoop.cli.TestErasureCodingCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.082 sec - in org.apache.hadoop.cli.TestCacheAdminCLI
Running org.apache.hadoop.cli.TestXAttrCLI
Running org.apache.hadoop.cli.TestCryptoAdminCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.524 sec - in org.apache.hadoop.cli.TestXAttrCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.591 sec - in org.apache.hadoop.cli.TestCryptoAdminCLI
Running org.apache.hadoop.security.TestPermission
Running org.apache.hadoop.security.TestRefreshUserMappings
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 53.619 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Running org.apache.hadoop.security.TestPermissionSymlinks
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.21 sec - in org.apache.hadoop.security.TestRefreshUserMappings
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.232 sec - in org.apache.hadoop.security.TestPermission
Running org.apache.hadoop.tools.TestTools
Running org.apache.hadoop.tools.TestHdfsConfigFields
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.431 sec - in org.apache.hadoop.tools.TestHdfsConfigFields
Running org.apache.hadoop.tools.TestJMXGet
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.013 sec - in org.apache.hadoop.tools.TestTools
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.609 sec - in org.apache.hadoop.security.TestPermissionSymlinks
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.555 sec - in org.apache.hadoop.tools.TestJMXGet
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 65.534 sec - in org.apache.hadoop.cli.TestHDFSCLI

Results :

Tests in error: 
  TestDiskspaceQuotaUpdate.testUpdateQuotaForFSync » IO Failed to replace a bad ...
  TestDiskspaceQuotaUpdate.testUpdateQuotaForAppend » IO Failed to replace a bad...

Tests run: 4418, Failures: 0, Errors: 2, Skipped: 17

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Skipping javadoc generation
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:02 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [56:04 min]
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.140 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:00 h
[INFO] Finished at: 2016-05-16T22:20:49+00:00
[INFO] Final Memory: 57M/624M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/hadoop-hdfs> && /home/jenkins/tools/java/jdk1.7.0_55/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -DminiClusterDedicatedDirs=true -jar <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter7514952471087814996.jar> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire6774934879911219787tmp> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_214264296991204908227tmp>
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
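
A minimal local-reproduction sketch for the two TestDiskspaceQuotaUpdate errors above (the module path, test class name, and the -rf hint are taken from the console output; everything else is an assumption, not the slave's exact setup):

    # run only the failing test class inside the hadoop-hdfs module
    cd hadoop-hdfs-project/hadoop-hdfs
    mvn test -Dtest=TestDiskspaceQuotaUpdate
    # or, from hadoop-hdfs-project, resume the reactor at the failed module
    # with the job's goals, as the [ERROR] hint above suggests:
    #   mvn <goals> -rf :hadoop-hdfs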

---------------------------------------------------------------------
To unsubscribe, e-mail: hdfs-dev-unsubscribe@hadoop.apache.org
For additional commands, e-mail: hdfs-dev-help@hadoop.apache.org


Hadoop-Hdfs-trunk - Build # 3151 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/3151/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 349 lines...]
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[WARNING] Rule 1: org.apache.maven.plugins.enforcer.RequireJavaVersion failed with message:
Detected JDK Version: 1.7.0-55 is not in the allowed range [1.8,).
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... FAILURE [  1.749 s]
[INFO] Apache Hadoop HDFS ................................ SKIPPED
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ FAILURE [  0.523 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3.676 s
[INFO] Finished at: 2016-05-16T23:06:53+00:00
[INFO] Final Memory: 25M/723M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT:protoc (compile-protoc) on project hadoop-hdfs-client: Execution compile-protoc of goal org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT:protoc failed: Unable to load the mojo 'protoc' in the plugin 'org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT' due to an API incompatibility: org.codehaus.plexus.component.repository.exception.ComponentLookupException: org/apache/hadoop/maven/plugin/protoc/ProtocMojo : Unsupported major.minor version 52.0
[ERROR] -----------------------------------------------------
[ERROR] realm =    plugin>org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT
[ERROR] strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
[ERROR] urls[0] = file:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-maven-plugins/3.0.0-alpha1-SNAPSHOT/hadoop-maven-plugins-3.0.0-alpha1-SNAPSHOT.jar
[ERROR] urls[1] = file:/home/jenkins/.m2/repository/org/sonatype/sisu/sisu-inject-bean/1.4.2/sisu-inject-bean-1.4.2.jar
[ERROR] urls[2] = file:/home/jenkins/.m2/repository/org/sonatype/sisu/sisu-guice/2.1.7/sisu-guice-2.1.7-noaop.jar
[ERROR] urls[3] = file:/home/jenkins/.m2/repository/org/sonatype/aether/aether-util/1.7/aether-util-1.7.jar
[ERROR] urls[4] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-interpolation/1.14/plexus-interpolation-1.14.jar
[ERROR] urls[5] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-utils/2.0.5/plexus-utils-2.0.5.jar
[ERROR] urls[6] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
[ERROR] urls[7] = file:/home/jenkins/.m2/repository/org/sonatype/plexus/plexus-sec-dispatcher/1.3/plexus-sec-dispatcher-1.3.jar
[ERROR] urls[8] = file:/home/jenkins/.m2/repository/org/sonatype/plexus/plexus-cipher/1.4/plexus-cipher-1.4.jar
[ERROR] urls[9] = file:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar
[ERROR] urls[10] = file:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar
[ERROR] Number of foreign imports: 1
[ERROR] import: Entry[import  from realm ClassRealm[project>org.apache.hadoop:hadoop-hdfs-project:3.0.0-alpha1-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]]
[ERROR] 
[ERROR] -----------------------------------------------------
[ERROR] -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (dist-enforce) on project hadoop-hdfs-project: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 2]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginContainerException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.
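
The #3151 run is an environment mismatch rather than a test failure: the cached hadoop-maven-plugins 3.0.0-alpha1-SNAPSHOT was compiled for Java 8 (class file version 52.0), while the job ran on jdk1.7.0_55, which is exactly what the enforcer message "Detected JDK Version: 1.7.0-55 is not in the allowed range [1.8,)" reports. A minimal sketch of the kind of fix this points to, assuming a JDK 8 install exists on the slave (the JAVA_HOME path below is hypothetical; the mvn invocation is copied from the job's own shell step):

    # point the build at a Java 8 runtime so the enforcer range [1.8,) passes
    # and the plugin's ProtocMojo (class file version 52.0) can be loaded
    export JAVA_HOME=/home/jenkins/tools/java/jdk1.8.0    # hypothetical path
    export PATH="$JAVA_HOME/bin:$PATH"
    cd hadoop-hdfs-project
    /home/jenkins/tools/maven/latest/bin/mvn clean verify checkstyle:checkstyle findbugs:findbugs -Drequire.test.libhadoop -Pdist,docs,native,parallel-tests -Dtar -fae -Dmaven.javadoc.skip=true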

Hadoop-Hdfs-trunk - Build # 3153 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/3153/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 349 lines...]
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[WARNING] Rule 1: org.apache.maven.plugins.enforcer.RequireJavaVersion failed with message:
Detected JDK Version: 1.7.0-55 is not in the allowed range [1.8,).
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... FAILURE [  1.647 s]
[INFO] Apache Hadoop HDFS ................................ SKIPPED
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ FAILURE [  0.586 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3.685 s
[INFO] Finished at: 2016-05-17T01:44:44+00:00
[INFO] Final Memory: 25M/723M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT:protoc (compile-protoc) on project hadoop-hdfs-client: Execution compile-protoc of goal org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT:protoc failed: Unable to load the mojo 'protoc' in the plugin 'org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT' due to an API incompatibility: org.codehaus.plexus.component.repository.exception.ComponentLookupException: org/apache/hadoop/maven/plugin/protoc/ProtocMojo : Unsupported major.minor version 52.0
[ERROR] -----------------------------------------------------
[ERROR] realm =    plugin>org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT
[ERROR] strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
[ERROR] urls[0] = file:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-maven-plugins/3.0.0-alpha1-SNAPSHOT/hadoop-maven-plugins-3.0.0-alpha1-SNAPSHOT.jar
[ERROR] urls[1] = file:/home/jenkins/.m2/repository/org/sonatype/sisu/sisu-inject-bean/1.4.2/sisu-inject-bean-1.4.2.jar
[ERROR] urls[2] = file:/home/jenkins/.m2/repository/org/sonatype/sisu/sisu-guice/2.1.7/sisu-guice-2.1.7-noaop.jar
[ERROR] urls[3] = file:/home/jenkins/.m2/repository/org/sonatype/aether/aether-util/1.7/aether-util-1.7.jar
[ERROR] urls[4] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-interpolation/1.14/plexus-interpolation-1.14.jar
[ERROR] urls[5] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-utils/2.0.5/plexus-utils-2.0.5.jar
[ERROR] urls[6] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
[ERROR] urls[7] = file:/home/jenkins/.m2/repository/org/sonatype/plexus/plexus-sec-dispatcher/1.3/plexus-sec-dispatcher-1.3.jar
[ERROR] urls[8] = file:/home/jenkins/.m2/repository/org/sonatype/plexus/plexus-cipher/1.4/plexus-cipher-1.4.jar
[ERROR] urls[9] = file:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar
[ERROR] urls[10] = file:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar
[ERROR] Number of foreign imports: 1
[ERROR] import: Entry[import  from realm ClassRealm[project>org.apache.hadoop:hadoop-hdfs-project:3.0.0-alpha1-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]]
[ERROR] 
[ERROR] -----------------------------------------------------
[ERROR] -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (dist-enforce) on project hadoop-hdfs-project: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 2]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginContainerException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Build failed in Jenkins: Hadoop-Hdfs-trunk #3153

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/3153/changes>

Changes:

[aw] HADOOP-12930. Dynamic subcommands for hadoop shell scripts (aw)

------------------------------------------
[...truncated 157 lines...]
[INFO] Apache Hadoop YARN Server ......................... SKIPPED
[INFO] Apache Hadoop YARN Server Common .................. SKIPPED
[INFO] Apache Hadoop YARN NodeManager .................... SKIPPED
[INFO] Apache Hadoop YARN Web Proxy ...................... SKIPPED
[INFO] Apache Hadoop YARN ApplicationHistoryService ...... SKIPPED
[INFO] Apache Hadoop YARN ResourceManager ................ SKIPPED
[INFO] Apache Hadoop YARN Server Tests ................... SKIPPED
[INFO] Apache Hadoop YARN Client ......................... SKIPPED
[INFO] Apache Hadoop YARN SharedCacheManager ............. SKIPPED
[INFO] Apache Hadoop YARN Timeline Plugin Storage ........ SKIPPED
[INFO] Apache Hadoop YARN Applications ................... SKIPPED
[INFO] Apache Hadoop YARN DistributedShell ............... SKIPPED
[INFO] Apache Hadoop YARN Unmanaged Am Launcher .......... SKIPPED
[INFO] Apache Hadoop YARN Site ........................... SKIPPED
[INFO] Apache Hadoop YARN Registry ....................... SKIPPED
[INFO] Apache Hadoop YARN Project ........................ SKIPPED
[INFO] Apache Hadoop MapReduce Client .................... SKIPPED
[INFO] Apache Hadoop MapReduce Core ...................... SKIPPED
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] Apache Hadoop MapReduce Streaming ................. SKIPPED
[INFO] Apache Hadoop Distributed Copy .................... SKIPPED
[INFO] Apache Hadoop Archives ............................ SKIPPED
[INFO] Apache Hadoop Archive Logs ........................ SKIPPED
[INFO] Apache Hadoop Rumen ............................... SKIPPED
[INFO] Apache Hadoop Gridmix ............................. SKIPPED
[INFO] Apache Hadoop Data Join ........................... SKIPPED
[INFO] Apache Hadoop Ant Tasks ........................... SKIPPED
[INFO] Apache Hadoop Extras .............................. SKIPPED
[INFO] Apache Hadoop Pipes ............................... SKIPPED
[INFO] Apache Hadoop OpenStack support ................... SKIPPED
[INFO] Apache Hadoop Amazon Web Services support ......... SKIPPED
[INFO] Apache Hadoop Azure support ....................... SKIPPED
[INFO] Apache Hadoop Client .............................. SKIPPED
[INFO] Apache Hadoop Mini-Cluster ........................ SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator ............ SKIPPED
[INFO] Apache Hadoop Tools Dist .......................... SKIPPED
[INFO] Apache Hadoop Kafka Library support ............... SKIPPED
[INFO] Apache Hadoop Tools ............................... SKIPPED
[INFO] Apache Hadoop Distribution ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2.819 s
[INFO] Finished at: 2016-05-17T01:44:39+00:00
[INFO] Final Memory: 38M/913M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (clean) on project hadoop-main: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ cd hadoop-hdfs-project
+ /home/jenkins/tools/maven/latest/bin/mvn clean verify checkstyle:checkstyle findbugs:findbugs -Drequire.test.libhadoop -Pdist,docs,native,parallel-tests -Dtar -fae -Dmaven.javadoc.skip=true
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] Apache Hadoop HDFS Client
[INFO] Apache Hadoop HDFS
[INFO] Apache Hadoop HDFS Native Client
[INFO] Apache Hadoop HttpFS
[INFO] Apache Hadoop HDFS BookKeeper Journal
[INFO] Apache Hadoop HDFS-NFS
[INFO] Apache Hadoop HDFS Project
[INFO] 
[INFO] Using the builder org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder with a thread count of 1
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Client 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-client ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-client ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/test-dir>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/test/data>
[INFO] Executed tasks
[INFO] 
[INFO] --- hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT:protoc (compile-protoc) @ hadoop-hdfs-client ---
[WARNING] Error injecting: org.apache.hadoop.maven.plugin.protoc.ProtocMojo
java.lang.TypeNotPresentException: Type org.apache.hadoop.maven.plugin.protoc.ProtocMojo not present
	at org.eclipse.sisu.space.URLClassSpace.loadClass(URLClassSpace.java:115)
	at org.eclipse.sisu.space.NamedClass.load(NamedClass.java:46)
	at org.eclipse.sisu.space.AbstractDeferredClass.get(AbstractDeferredClass.java:48)
	at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:86)
	at com.google.inject.internal.InternalFactoryToInitializableAdapter.provision(InternalFactoryToInitializableAdapter.java:55)
	at com.google.inject.internal.ProviderInternalFactory$1.call(ProviderInternalFactory.java:70)
	at com.google.inject.internal.ProvisionListenerStackCallback$Provision.provision(ProvisionListenerStackCallback.java:100)
	at org.eclipse.sisu.plexus.PlexusLifecycleManager.onProvision(PlexusLifecycleManager.java:133)
	at com.google.inject.internal.ProvisionListenerStackCallback$Provision.provision(ProvisionListenerStackCallback.java:109)
	at com.google.inject.internal.ProvisionListenerStackCallback.provision(ProvisionListenerStackCallback.java:55)
	at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:68)
	at com.google.inject.internal.InternalFactoryToInitializableAdapter.get(InternalFactoryToInitializableAdapter.java:47)
	at com.google.inject.internal.InjectorImpl$2$1.call(InjectorImpl.java:997)
	at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1047)
	at com.google.inject.internal.InjectorImpl$2.get(InjectorImpl.java:993)
	at com.google.inject.Scopes$1$1.get(Scopes.java:59)
	at org.eclipse.sisu.inject.LazyBeanEntry.getValue(LazyBeanEntry.java:82)
	at org.eclipse.sisu.plexus.LazyPlexusBean.getValue(LazyPlexusBean.java:51)
	at org.codehaus.plexus.DefaultPlexusContainer.lookup(DefaultPlexusContainer.java:260)
	at org.codehaus.plexus.DefaultPlexusContainer.lookup(DefaultPlexusContainer.java:252)
	at org.apache.maven.plugin.internal.DefaultMavenPluginManager.getConfiguredMojo(DefaultMavenPluginManager.java:462)
	at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:120)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:108)
	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:76)
	at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
	at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:116)
	at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:361)
	at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:155)
	at org.apache.maven.cli.MavenCli.execute(MavenCli.java:584)
	at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:213)
	at org.apache.maven.cli.MavenCli.main(MavenCli.java:157)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
	at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
	at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
	at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: java.lang.UnsupportedClassVersionError: org/apache/hadoop/maven/plugin/protoc/ProtocMojo : Unsupported major.minor version 52.0
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClassFromSelf(ClassRealm.java:389)
	at org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy.loadClass(SelfFirstStrategy.java:42)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.unsynchronizedLoadClass(ClassRealm.java:259)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:235)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:227)
	at org.eclipse.sisu.space.URLClassSpace.loadClass(URLClassSpace.java:107)
	... 41 more
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[WARNING] Rule 1: org.apache.maven.plugins.enforcer.RequireJavaVersion failed with message:
Detected JDK Version: 1.7.0-55 is not in the allowed range [1.8,).
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... FAILURE [  1.647 s]
[INFO] Apache Hadoop HDFS ................................ SKIPPED
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ FAILURE [  0.586 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3.685 s
[INFO] Finished at: 2016-05-17T01:44:44+00:00
[INFO] Final Memory: 25M/723M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT:protoc (compile-protoc) on project hadoop-hdfs-client: Execution compile-protoc of goal org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT:protoc failed: Unable to load the mojo 'protoc' in the plugin 'org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT' due to an API incompatibility: org.codehaus.plexus.component.repository.exception.ComponentLookupException: org/apache/hadoop/maven/plugin/protoc/ProtocMojo : Unsupported major.minor version 52.0
[ERROR] -----------------------------------------------------
[ERROR] realm =    plugin>org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT
[ERROR] strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
[ERROR] urls[0] = file:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-maven-plugins/3.0.0-alpha1-SNAPSHOT/hadoop-maven-plugins-3.0.0-alpha1-SNAPSHOT.jar
[ERROR] urls[1] = file:/home/jenkins/.m2/repository/org/sonatype/sisu/sisu-inject-bean/1.4.2/sisu-inject-bean-1.4.2.jar
[ERROR] urls[2] = file:/home/jenkins/.m2/repository/org/sonatype/sisu/sisu-guice/2.1.7/sisu-guice-2.1.7-noaop.jar
[ERROR] urls[3] = file:/home/jenkins/.m2/repository/org/sonatype/aether/aether-util/1.7/aether-util-1.7.jar
[ERROR] urls[4] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-interpolation/1.14/plexus-interpolation-1.14.jar
[ERROR] urls[5] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-utils/2.0.5/plexus-utils-2.0.5.jar
[ERROR] urls[6] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
[ERROR] urls[7] = file:/home/jenkins/.m2/repository/org/sonatype/plexus/plexus-sec-dispatcher/1.3/plexus-sec-dispatcher-1.3.jar
[ERROR] urls[8] = file:/home/jenkins/.m2/repository/org/sonatype/plexus/plexus-cipher/1.4/plexus-cipher-1.4.jar
[ERROR] urls[9] = file:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar
[ERROR] urls[10] = file:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar
[ERROR] Number of foreign imports: 1
[ERROR] import: Entry[import  from realm ClassRealm[project>org.apache.hadoop:hadoop-hdfs-project:3.0.0-alpha1-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]]
[ERROR] 
[ERROR] -----------------------------------------------------
[ERROR] -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (dist-enforce) on project hadoop-hdfs-project: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 2]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginContainerException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?


---------------------------------------------------------------------
To unsubscribe, e-mail: hdfs-dev-unsubscribe@hadoop.apache.org
For additional commands, e-mail: hdfs-dev-help@hadoop.apache.org


Build failed in Jenkins: Hadoop-Hdfs-trunk #3152

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/3152/changes>

Changes:

[lei] HDFS-10410. RedundantEditLogInputStream.LOG is set to wrong class. (John

------------------------------------------
[...truncated 173 lines...]
[INFO] Apache Hadoop MapReduce Client .................... SKIPPED
[INFO] Apache Hadoop MapReduce Core ...................... SKIPPED
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] Apache Hadoop MapReduce Streaming ................. SKIPPED
[INFO] Apache Hadoop Distributed Copy .................... SKIPPED
[INFO] Apache Hadoop Archives ............................ SKIPPED
[INFO] Apache Hadoop Archive Logs ........................ SKIPPED
[INFO] Apache Hadoop Rumen ............................... SKIPPED
[INFO] Apache Hadoop Gridmix ............................. SKIPPED
[INFO] Apache Hadoop Data Join ........................... SKIPPED
[INFO] Apache Hadoop Ant Tasks ........................... SKIPPED
[INFO] Apache Hadoop Extras .............................. SKIPPED
[INFO] Apache Hadoop Pipes ............................... SKIPPED
[INFO] Apache Hadoop OpenStack support ................... SKIPPED
[INFO] Apache Hadoop Amazon Web Services support ......... SKIPPED
[INFO] Apache Hadoop Azure support ....................... SKIPPED
[INFO] Apache Hadoop Client .............................. SKIPPED
[INFO] Apache Hadoop Mini-Cluster ........................ SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator ............ SKIPPED
[INFO] Apache Hadoop Tools Dist .......................... SKIPPED
[INFO] Apache Hadoop Kafka Library support ............... SKIPPED
[INFO] Apache Hadoop Tools ............................... SKIPPED
[INFO] Apache Hadoop Distribution ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2.797 s
[INFO] Finished at: 2016-05-17T00:56:19+00:00
[INFO] Final Memory: 38M/913M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (clean) on project hadoop-main: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ cd hadoop-hdfs-project
+ /home/jenkins/tools/maven/latest/bin/mvn clean verify checkstyle:checkstyle findbugs:findbugs -Drequire.test.libhadoop -Pdist,docs,native,parallel-tests -Dtar -fae -Dmaven.javadoc.skip=true
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] Apache Hadoop HDFS Client
[INFO] Apache Hadoop HDFS
[INFO] Apache Hadoop HDFS Native Client
[INFO] Apache Hadoop HttpFS
[INFO] Apache Hadoop HDFS BookKeeper Journal
[INFO] Apache Hadoop HDFS-NFS
[INFO] Apache Hadoop HDFS Project
[INFO] 
[INFO] Using the builder org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder with a thread count of 1
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Client 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
Downloading: https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-common/3.0.0-alpha1-SNAPSHOT/maven-metadata.xml
2/2 KB   
         
Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-common/3.0.0-alpha1-SNAPSHOT/maven-metadata.xml (2 KB at 2.4 KB/sec)
Downloading: https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-project-dist/3.0.0-alpha1-SNAPSHOT/maven-metadata.xml
849/849 B   
            
Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-project-dist/3.0.0-alpha1-SNAPSHOT/maven-metadata.xml (849 B at 7.7 KB/sec)
Downloading: https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-annotations/3.0.0-alpha1-SNAPSHOT/maven-metadata.xml
809/809 B   
            
Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-annotations/3.0.0-alpha1-SNAPSHOT/maven-metadata.xml (809 B at 7.5 KB/sec)
Downloading: https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-auth/3.0.0-alpha1-SNAPSHOT/maven-metadata.xml
2/2 KB      
         
Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-auth/3.0.0-alpha1-SNAPSHOT/maven-metadata.xml (2 KB at 11.6 KB/sec)
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-client ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-client ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/test-dir>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/test/data>
[INFO] Executed tasks
[INFO] 
[INFO] --- hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT:protoc (compile-protoc) @ hadoop-hdfs-client ---
[WARNING] Error injecting: org.apache.hadoop.maven.plugin.protoc.ProtocMojo
java.lang.TypeNotPresentException: Type org.apache.hadoop.maven.plugin.protoc.ProtocMojo not present
	at org.eclipse.sisu.space.URLClassSpace.loadClass(URLClassSpace.java:115)
	at org.eclipse.sisu.space.NamedClass.load(NamedClass.java:46)
	at org.eclipse.sisu.space.AbstractDeferredClass.get(AbstractDeferredClass.java:48)
	at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:86)
	at com.google.inject.internal.InternalFactoryToInitializableAdapter.provision(InternalFactoryToInitializableAdapter.java:55)
	at com.google.inject.internal.ProviderInternalFactory$1.call(ProviderInternalFactory.java:70)
	at com.google.inject.internal.ProvisionListenerStackCallback$Provision.provision(ProvisionListenerStackCallback.java:100)
	at org.eclipse.sisu.plexus.PlexusLifecycleManager.onProvision(PlexusLifecycleManager.java:133)
	at com.google.inject.internal.ProvisionListenerStackCallback$Provision.provision(ProvisionListenerStackCallback.java:109)
	at com.google.inject.internal.ProvisionListenerStackCallback.provision(ProvisionListenerStackCallback.java:55)
	at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:68)
	at com.google.inject.internal.InternalFactoryToInitializableAdapter.get(InternalFactoryToInitializableAdapter.java:47)
	at com.google.inject.internal.InjectorImpl$2$1.call(InjectorImpl.java:997)
	at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1047)
	at com.google.inject.internal.InjectorImpl$2.get(InjectorImpl.java:993)
	at com.google.inject.Scopes$1$1.get(Scopes.java:59)
	at org.eclipse.sisu.inject.LazyBeanEntry.getValue(LazyBeanEntry.java:82)
	at org.eclipse.sisu.plexus.LazyPlexusBean.getValue(LazyPlexusBean.java:51)
	at org.codehaus.plexus.DefaultPlexusContainer.lookup(DefaultPlexusContainer.java:260)
	at org.codehaus.plexus.DefaultPlexusContainer.lookup(DefaultPlexusContainer.java:252)
	at org.apache.maven.plugin.internal.DefaultMavenPluginManager.getConfiguredMojo(DefaultMavenPluginManager.java:462)
	at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:120)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:108)
	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:76)
	at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
	at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:116)
	at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:361)
	at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:155)
	at org.apache.maven.cli.MavenCli.execute(MavenCli.java:584)
	at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:213)
	at org.apache.maven.cli.MavenCli.main(MavenCli.java:157)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
	at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
	at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
	at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: java.lang.UnsupportedClassVersionError: org/apache/hadoop/maven/plugin/protoc/ProtocMojo : Unsupported major.minor version 52.0
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClassFromSelf(ClassRealm.java:389)
	at org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy.loadClass(SelfFirstStrategy.java:42)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.unsynchronizedLoadClass(ClassRealm.java:259)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:235)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:227)
	at org.eclipse.sisu.space.URLClassSpace.loadClass(URLClassSpace.java:107)
	... 41 more
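The UnsupportedClassVersionError above means hadoop-maven-plugins was compiled for a newer JVM than the one running this build: class file major version 52 corresponds to Java 8, while the slave is on JDK 1.7.0-55 (see the enforcer message further down). A quick way to confirm the mismatch on the slave is sketched below; the jar and class paths are copied from the classworlds realm dump in this log, and the commands themselves are illustrative, not part of the Jenkins job.

# Which JDK the shell and Maven are actually running (expected to report 1.7.0_55 here)
java -version
mvn -version

# Class file major version of the mojo that failed to load.
# Bytes 6-7 of a .class file hold the major version: 0x33 = 51 (Java 7), 0x34 = 52 (Java 8).
unzip -p /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-maven-plugins/3.0.0-alpha1-SNAPSHOT/hadoop-maven-plugins-3.0.0-alpha1-SNAPSHOT.jar \
    org/apache/hadoop/maven/plugin/protoc/ProtocMojo.class | xxd -l 8
# -> cafe babe 0000 0034   (0x34 = 52, i.e. compiled for Java 8)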
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[WARNING] Rule 1: org.apache.maven.plugins.enforcer.RequireJavaVersion failed with message:
Detected JDK Version: 1.7.0-55 is not in the allowed range [1.8,).
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... FAILURE [  2.596 s]
[INFO] Apache Hadoop HDFS ................................ SKIPPED
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ FAILURE [  0.443 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4.452 s
[INFO] Finished at: 2016-05-17T00:56:25+00:00
[INFO] Final Memory: 32M/913M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT:protoc (compile-protoc) on project hadoop-hdfs-client: Execution compile-protoc of goal org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT:protoc failed: Unable to load the mojo 'protoc' in the plugin 'org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT' due to an API incompatibility: org.codehaus.plexus.component.repository.exception.ComponentLookupException: org/apache/hadoop/maven/plugin/protoc/ProtocMojo : Unsupported major.minor version 52.0
[ERROR] -----------------------------------------------------
[ERROR] realm =    plugin>org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT
[ERROR] strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
[ERROR] urls[0] = file:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-maven-plugins/3.0.0-alpha1-SNAPSHOT/hadoop-maven-plugins-3.0.0-alpha1-SNAPSHOT.jar
[ERROR] urls[1] = file:/home/jenkins/.m2/repository/org/sonatype/sisu/sisu-inject-bean/1.4.2/sisu-inject-bean-1.4.2.jar
[ERROR] urls[2] = file:/home/jenkins/.m2/repository/org/sonatype/sisu/sisu-guice/2.1.7/sisu-guice-2.1.7-noaop.jar
[ERROR] urls[3] = file:/home/jenkins/.m2/repository/org/sonatype/aether/aether-util/1.7/aether-util-1.7.jar
[ERROR] urls[4] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-interpolation/1.14/plexus-interpolation-1.14.jar
[ERROR] urls[5] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-utils/2.0.5/plexus-utils-2.0.5.jar
[ERROR] urls[6] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
[ERROR] urls[7] = file:/home/jenkins/.m2/repository/org/sonatype/plexus/plexus-sec-dispatcher/1.3/plexus-sec-dispatcher-1.3.jar
[ERROR] urls[8] = file:/home/jenkins/.m2/repository/org/sonatype/plexus/plexus-cipher/1.4/plexus-cipher-1.4.jar
[ERROR] urls[9] = file:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar
[ERROR] urls[10] = file:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar
[ERROR] Number of foreign imports: 1
[ERROR] import: Entry[import  from realm ClassRealm[project>org.apache.hadoop:hadoop-hdfs-project:3.0.0-alpha1-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]]
[ERROR] 
[ERROR] -----------------------------------------------------
[ERROR] -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (dist-enforce) on project hadoop-hdfs-project: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 2]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginContainerException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
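The build never reaches the HDFS tests: both mvn invocations fail because the slave is still on JDK 7 while trunk now requires Java 8 (HADOOP-11858, listed in the changes for build #3151 further down). A minimal sketch of the kind of fix needed in the job's shell step follows; the JDK 8 install path is a placeholder and would have to match whatever the Jenkins slave actually provides.

# Placeholder path -- substitute the slave's real JDK 8 installation.
export JAVA_HOME=/path/to/jdk1.8.0
export PATH="$JAVA_HOME/bin:$PATH"

# Same invocation the job already runs, now on a Java 8 VM.
cd hadoop-hdfs-project
/home/jenkins/tools/maven/latest/bin/mvn clean verify checkstyle:checkstyle findbugs:findbugs \
    -Drequire.test.libhadoop -Pdist,docs,native,parallel-tests -Dtar -fae -Dmaven.javadoc.skip=true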


---------------------------------------------------------------------
To unsubscribe, e-mail: hdfs-dev-unsubscribe@hadoop.apache.org
For additional commands, e-mail: hdfs-dev-help@hadoop.apache.org


Hadoop-Hdfs-trunk - Build # 3152 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/3152/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 365 lines...]
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[WARNING] Rule 1: org.apache.maven.plugins.enforcer.RequireJavaVersion failed with message:
Detected JDK Version: 1.7.0-55 is not in the allowed range [1.8,).
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... FAILURE [  2.596 s]
[INFO] Apache Hadoop HDFS ................................ SKIPPED
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ FAILURE [  0.443 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4.452 s
[INFO] Finished at: 2016-05-17T00:56:25+00:00
[INFO] Final Memory: 32M/913M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT:protoc (compile-protoc) on project hadoop-hdfs-client: Execution compile-protoc of goal org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT:protoc failed: Unable to load the mojo 'protoc' in the plugin 'org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT' due to an API incompatibility: org.codehaus.plexus.component.repository.exception.ComponentLookupException: org/apache/hadoop/maven/plugin/protoc/ProtocMojo : Unsupported major.minor version 52.0
[ERROR] -----------------------------------------------------
[ERROR] realm =    plugin>org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT
[ERROR] strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
[ERROR] urls[0] = file:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-maven-plugins/3.0.0-alpha1-SNAPSHOT/hadoop-maven-plugins-3.0.0-alpha1-SNAPSHOT.jar
[ERROR] urls[1] = file:/home/jenkins/.m2/repository/org/sonatype/sisu/sisu-inject-bean/1.4.2/sisu-inject-bean-1.4.2.jar
[ERROR] urls[2] = file:/home/jenkins/.m2/repository/org/sonatype/sisu/sisu-guice/2.1.7/sisu-guice-2.1.7-noaop.jar
[ERROR] urls[3] = file:/home/jenkins/.m2/repository/org/sonatype/aether/aether-util/1.7/aether-util-1.7.jar
[ERROR] urls[4] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-interpolation/1.14/plexus-interpolation-1.14.jar
[ERROR] urls[5] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-utils/2.0.5/plexus-utils-2.0.5.jar
[ERROR] urls[6] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
[ERROR] urls[7] = file:/home/jenkins/.m2/repository/org/sonatype/plexus/plexus-sec-dispatcher/1.3/plexus-sec-dispatcher-1.3.jar
[ERROR] urls[8] = file:/home/jenkins/.m2/repository/org/sonatype/plexus/plexus-cipher/1.4/plexus-cipher-1.4.jar
[ERROR] urls[9] = file:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar
[ERROR] urls[10] = file:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar
[ERROR] Number of foreign imports: 1
[ERROR] import: Entry[import  from realm ClassRealm[project>org.apache.hadoop:hadoop-hdfs-project:3.0.0-alpha1-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]]
[ERROR] 
[ERROR] -----------------------------------------------------
[ERROR] -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (dist-enforce) on project hadoop-hdfs-project: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 2]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginContainerException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Build failed in Jenkins: Hadoop-Hdfs-trunk #3151

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/3151/changes>

Changes:

[jing9] HADOOP-13146. Refactor RetryInvocationHandler. Contributed by Tsz Wo

[wang] HADOOP-11858. [JDK8] Set minimum version of Hadoop 3 to JDK 8.

------------------------------------------
[...truncated 157 lines...]
[INFO] Apache Hadoop YARN Server ......................... SKIPPED
[INFO] Apache Hadoop YARN Server Common .................. SKIPPED
[INFO] Apache Hadoop YARN NodeManager .................... SKIPPED
[INFO] Apache Hadoop YARN Web Proxy ...................... SKIPPED
[INFO] Apache Hadoop YARN ApplicationHistoryService ...... SKIPPED
[INFO] Apache Hadoop YARN ResourceManager ................ SKIPPED
[INFO] Apache Hadoop YARN Server Tests ................... SKIPPED
[INFO] Apache Hadoop YARN Client ......................... SKIPPED
[INFO] Apache Hadoop YARN SharedCacheManager ............. SKIPPED
[INFO] Apache Hadoop YARN Timeline Plugin Storage ........ SKIPPED
[INFO] Apache Hadoop YARN Applications ................... SKIPPED
[INFO] Apache Hadoop YARN DistributedShell ............... SKIPPED
[INFO] Apache Hadoop YARN Unmanaged Am Launcher .......... SKIPPED
[INFO] Apache Hadoop YARN Site ........................... SKIPPED
[INFO] Apache Hadoop YARN Registry ....................... SKIPPED
[INFO] Apache Hadoop YARN Project ........................ SKIPPED
[INFO] Apache Hadoop MapReduce Client .................... SKIPPED
[INFO] Apache Hadoop MapReduce Core ...................... SKIPPED
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] Apache Hadoop MapReduce Streaming ................. SKIPPED
[INFO] Apache Hadoop Distributed Copy .................... SKIPPED
[INFO] Apache Hadoop Archives ............................ SKIPPED
[INFO] Apache Hadoop Archive Logs ........................ SKIPPED
[INFO] Apache Hadoop Rumen ............................... SKIPPED
[INFO] Apache Hadoop Gridmix ............................. SKIPPED
[INFO] Apache Hadoop Data Join ........................... SKIPPED
[INFO] Apache Hadoop Ant Tasks ........................... SKIPPED
[INFO] Apache Hadoop Extras .............................. SKIPPED
[INFO] Apache Hadoop Pipes ............................... SKIPPED
[INFO] Apache Hadoop OpenStack support ................... SKIPPED
[INFO] Apache Hadoop Amazon Web Services support ......... SKIPPED
[INFO] Apache Hadoop Azure support ....................... SKIPPED
[INFO] Apache Hadoop Client .............................. SKIPPED
[INFO] Apache Hadoop Mini-Cluster ........................ SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator ............ SKIPPED
[INFO] Apache Hadoop Tools Dist .......................... SKIPPED
[INFO] Apache Hadoop Kafka Library support ............... SKIPPED
[INFO] Apache Hadoop Tools ............................... SKIPPED
[INFO] Apache Hadoop Distribution ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2.830 s
[INFO] Finished at: 2016-05-16T23:06:48+00:00
[INFO] Final Memory: 38M/913M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (clean) on project hadoop-main: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ cd hadoop-hdfs-project
+ /home/jenkins/tools/maven/latest/bin/mvn clean verify checkstyle:checkstyle findbugs:findbugs -Drequire.test.libhadoop -Pdist,docs,native,parallel-tests -Dtar -fae -Dmaven.javadoc.skip=true
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] Apache Hadoop HDFS Client
[INFO] Apache Hadoop HDFS
[INFO] Apache Hadoop HDFS Native Client
[INFO] Apache Hadoop HttpFS
[INFO] Apache Hadoop HDFS BookKeeper Journal
[INFO] Apache Hadoop HDFS-NFS
[INFO] Apache Hadoop HDFS Project
[INFO] 
[INFO] Using the builder org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder with a thread count of 1
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Client 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-client ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-client ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/test-dir>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/hadoop-hdfs-client/target/test/data>
[INFO] Executed tasks
[INFO] 
[INFO] --- hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT:protoc (compile-protoc) @ hadoop-hdfs-client ---
[WARNING] Error injecting: org.apache.hadoop.maven.plugin.protoc.ProtocMojo
java.lang.TypeNotPresentException: Type org.apache.hadoop.maven.plugin.protoc.ProtocMojo not present
	at org.eclipse.sisu.space.URLClassSpace.loadClass(URLClassSpace.java:115)
	at org.eclipse.sisu.space.NamedClass.load(NamedClass.java:46)
	at org.eclipse.sisu.space.AbstractDeferredClass.get(AbstractDeferredClass.java:48)
	at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:86)
	at com.google.inject.internal.InternalFactoryToInitializableAdapter.provision(InternalFactoryToInitializableAdapter.java:55)
	at com.google.inject.internal.ProviderInternalFactory$1.call(ProviderInternalFactory.java:70)
	at com.google.inject.internal.ProvisionListenerStackCallback$Provision.provision(ProvisionListenerStackCallback.java:100)
	at org.eclipse.sisu.plexus.PlexusLifecycleManager.onProvision(PlexusLifecycleManager.java:133)
	at com.google.inject.internal.ProvisionListenerStackCallback$Provision.provision(ProvisionListenerStackCallback.java:109)
	at com.google.inject.internal.ProvisionListenerStackCallback.provision(ProvisionListenerStackCallback.java:55)
	at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:68)
	at com.google.inject.internal.InternalFactoryToInitializableAdapter.get(InternalFactoryToInitializableAdapter.java:47)
	at com.google.inject.internal.InjectorImpl$2$1.call(InjectorImpl.java:997)
	at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1047)
	at com.google.inject.internal.InjectorImpl$2.get(InjectorImpl.java:993)
	at com.google.inject.Scopes$1$1.get(Scopes.java:59)
	at org.eclipse.sisu.inject.LazyBeanEntry.getValue(LazyBeanEntry.java:82)
	at org.eclipse.sisu.plexus.LazyPlexusBean.getValue(LazyPlexusBean.java:51)
	at org.codehaus.plexus.DefaultPlexusContainer.lookup(DefaultPlexusContainer.java:260)
	at org.codehaus.plexus.DefaultPlexusContainer.lookup(DefaultPlexusContainer.java:252)
	at org.apache.maven.plugin.internal.DefaultMavenPluginManager.getConfiguredMojo(DefaultMavenPluginManager.java:462)
	at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:120)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:108)
	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:76)
	at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
	at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:116)
	at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:361)
	at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:155)
	at org.apache.maven.cli.MavenCli.execute(MavenCli.java:584)
	at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:213)
	at org.apache.maven.cli.MavenCli.main(MavenCli.java:157)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
	at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
	at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
	at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: java.lang.UnsupportedClassVersionError: org/apache/hadoop/maven/plugin/protoc/ProtocMojo : Unsupported major.minor version 52.0
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClassFromSelf(ClassRealm.java:389)
	at org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy.loadClass(SelfFirstStrategy.java:42)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.unsynchronizedLoadClass(ClassRealm.java:259)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:235)
	at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:227)
	at org.eclipse.sisu.space.URLClassSpace.loadClass(URLClassSpace.java:107)
	... 41 more
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[WARNING] Rule 1: org.apache.maven.plugins.enforcer.RequireJavaVersion failed with message:
Detected JDK Version: 1.7.0-55 is not in the allowed range [1.8,).
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... FAILURE [  1.749 s]
[INFO] Apache Hadoop HDFS ................................ SKIPPED
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ FAILURE [  0.523 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3.676 s
[INFO] Finished at: 2016-05-16T23:06:53+00:00
[INFO] Final Memory: 25M/723M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT:protoc (compile-protoc) on project hadoop-hdfs-client: Execution compile-protoc of goal org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT:protoc failed: Unable to load the mojo 'protoc' in the plugin 'org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT' due to an API incompatibility: org.codehaus.plexus.component.repository.exception.ComponentLookupException: org/apache/hadoop/maven/plugin/protoc/ProtocMojo : Unsupported major.minor version 52.0
[ERROR] -----------------------------------------------------
[ERROR] realm =    plugin>org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1-SNAPSHOT
[ERROR] strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
[ERROR] urls[0] = file:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-maven-plugins/3.0.0-alpha1-SNAPSHOT/hadoop-maven-plugins-3.0.0-alpha1-SNAPSHOT.jar
[ERROR] urls[1] = file:/home/jenkins/.m2/repository/org/sonatype/sisu/sisu-inject-bean/1.4.2/sisu-inject-bean-1.4.2.jar
[ERROR] urls[2] = file:/home/jenkins/.m2/repository/org/sonatype/sisu/sisu-guice/2.1.7/sisu-guice-2.1.7-noaop.jar
[ERROR] urls[3] = file:/home/jenkins/.m2/repository/org/sonatype/aether/aether-util/1.7/aether-util-1.7.jar
[ERROR] urls[4] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-interpolation/1.14/plexus-interpolation-1.14.jar
[ERROR] urls[5] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-utils/2.0.5/plexus-utils-2.0.5.jar
[ERROR] urls[6] = file:/home/jenkins/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
[ERROR] urls[7] = file:/home/jenkins/.m2/repository/org/sonatype/plexus/plexus-sec-dispatcher/1.3/plexus-sec-dispatcher-1.3.jar
[ERROR] urls[8] = file:/home/jenkins/.m2/repository/org/sonatype/plexus/plexus-cipher/1.4/plexus-cipher-1.4.jar
[ERROR] urls[9] = file:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar
[ERROR] urls[10] = file:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar
[ERROR] Number of foreign imports: 1
[ERROR] import: Entry[import  from realm ClassRealm[project>org.apache.hadoop:hadoop-hdfs-project:3.0.0-alpha1-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]]
[ERROR] 
[ERROR] -----------------------------------------------------
[ERROR] -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (dist-enforce) on project hadoop-hdfs-project: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 2]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginContainerException
[ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?


---------------------------------------------------------------------
To unsubscribe, e-mail: hdfs-dev-unsubscribe@hadoop.apache.org
For additional commands, e-mail: hdfs-dev-help@hadoop.apache.org