Posted to hdfs-dev@hadoop.apache.org by Apache Hudson Server <hu...@hudson.apache.org> on 2011/02/11 00:02:04 UTC
Hadoop-Hdfs-22-branch - Build # 22 - Still Failing
See https://hudson.apache.org/hudson/job/Hadoop-Hdfs-22-branch/22/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 2906 lines...]
[junit] Running org.apache.hadoop.hdfs.TestDFSClientRetries
[junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 43.019 sec
[junit] Running org.apache.hadoop.hdfs.TestDFSPermission
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 17.339 sec
[junit] Running org.apache.hadoop.hdfs.TestDFSRemove
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 14.454 sec
[junit] Running org.apache.hadoop.hdfs.TestDFSStartupVersions
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 19.253 sec
[junit] Running org.apache.hadoop.hdfs.TestDFSUpgrade
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 27.149 sec
[junit] Running org.apache.hadoop.hdfs.TestDFSUtil
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.17 sec
[junit] Running org.apache.hadoop.hdfs.TestDatanodeBlockScanner
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 97.15 sec
[junit] Running org.apache.hadoop.hdfs.TestDatanodeConfig
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 2.451 sec
[junit] Running org.apache.hadoop.hdfs.TestDatanodeDeath
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 129.106 sec
[junit] Running org.apache.hadoop.hdfs.TestDatanodeRegistration
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 3.808 sec
[junit] Running org.apache.hadoop.hdfs.TestDecommission
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 32.382 sec
[junit] Running org.apache.hadoop.hdfs.TestDeprecatedKeys
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.257 sec
[junit] Running org.apache.hadoop.hdfs.TestDfsOverAvroRpc
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 3.394 sec
[junit] Running org.apache.hadoop.hdfs.TestFileAppend4
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 9.728 sec
[junit] Running org.apache.hadoop.hdfs.TestFileConcurrentReader
[junit] Tests run: 7, Failures: 0, Errors: 0, Time elapsed: 31.532 sec
[junit] Running org.apache.hadoop.hdfs.TestFileCreation
[junit] Tests run: 12, Failures: 0, Errors: 0, Time elapsed: 47.992 sec
[junit] Running org.apache.hadoop.hdfs.TestFileCreationClient
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 9.609 sec
[junit] Running org.apache.hadoop.hdfs.TestFileCreationDelete
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 12.825 sec
[junit] Running org.apache.hadoop.hdfs.TestFileCreationEmpty
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 8.136 sec
[junit] Running org.apache.hadoop.hdfs.TestHDFSFileSystemContract
[junit] Tests run: 28, Failures: 0, Errors: 0, Time elapsed: 35.695 sec
[junit] Running org.apache.hadoop.hdfs.TestHFlush
[junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 19.144 sec
[junit] Running org.apache.hadoop.hdfs.TestHftpFileSystem
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 2.738 sec
[junit] Running org.apache.hadoop.hdfs.TestInjectionForSimulatedStorage
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 14.365 sec
[junit] Running org.apache.hadoop.hdfs.TestLargeBlock
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 44.525 sec
[junit] Running org.apache.hadoop.hdfs.TestLeaseRecovery2
Build timed out. Aborting
[FINDBUGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED: TEST-org.apache.hadoop.hdfs.TestLeaseRecovery2.xml.<init>
Error Message:
Stack Trace:
Test report file /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/TEST-org.apache.hadoop.hdfs.TestLeaseRecovery2.xml was length 0
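A zero-length TEST-*.xml like the one above typically means the forked test JVM was killed (here by the "Build timed out. Aborting" line) before it wrote any XML, so the report parser finds an empty file. A minimal sketch of that condition, using a hypothetical helper name rather than anything from the Hadoop build:

```java
import java.io.File;
import java.io.IOException;

public class EmptyReportCheck {
    // True when a JUnit report file exists but is empty, i.e. the forked
    // test JVM was killed before it wrote any XML (hypothetical helper,
    // not from the Hadoop build scripts).
    static boolean isTruncatedReport(File report) {
        return report.exists() && report.length() == 0;
    }

    public static void main(String[] args) throws IOException {
        File report = File.createTempFile("TEST-sample", ".xml");
        report.deleteOnExit();
        System.out.println(isTruncatedReport(report)); // a freshly created temp file is empty
    }
}
```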
Hadoop-Hdfs-22-branch - Build # 31 - Still Failing
See https://hudson.apache.org/hudson/job/Hadoop-Hdfs-22-branch/31/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 3548 lines...]
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target
[echo] Including clover.jar in the war file ...
[cactifywar] Analyzing war: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/hdfsproxy-2.0-test.war
[cactifywar] Building war: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/test.war
cactifywar:
test-cactus:
[echo] Free Ports: startup-41552 / http-41553 / https-41554
[echo] Please take a deep breath while Cargo gets the Tomcat for running the servlet tests...
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/tomcat-config
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/tomcat-config/webapps
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/tomcat-config/temp
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/logs
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/reports
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf
[cactus] -----------------------------------------------------------------
[cactus] Running tests against Tomcat 5.x @ http://localhost:41553
[cactus] -----------------------------------------------------------------
[cactus] Deploying [/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/test.war] to [/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/tomcat-config/webapps]...
[cactus] Tomcat 5.x starting...
Server [Apache-Coyote/1.1] started
[cactus] WARNING: multiple versions of ant detected in path for junit
[cactus] jar:file:/homes/hudson/tools/ant/latest/lib/ant.jar!/org/apache/tools/ant/Project.class
[cactus] and jar:file:/homes/hudson/.ivy2/cache/ant/ant/jars/ant-1.6.5.jar!/org/apache/tools/ant/Project.class
[cactus] Running org.apache.hadoop.hdfsproxy.TestAuthorizationFilter
[cactus] Tests run: 4, Failures: 2, Errors: 0, Time elapsed: 0.497 sec
[cactus] Test org.apache.hadoop.hdfsproxy.TestAuthorizationFilter FAILED
[cactus] Running org.apache.hadoop.hdfsproxy.TestLdapIpDirFilter
[cactus] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.577 sec
[cactus] Tomcat 5.x started on port [41553]
[cactus] Running org.apache.hadoop.hdfsproxy.TestProxyFilter
[cactus] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.358 sec
[cactus] Running org.apache.hadoop.hdfsproxy.TestProxyForwardServlet
[cactus] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.316 sec
[cactus] Running org.apache.hadoop.hdfsproxy.TestProxyUtil
[cactus] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.855 sec
[cactus] Tomcat 5.x is stopping...
[cactus] Tomcat 5.x is stopped
BUILD FAILED
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:749: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:730: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/src/contrib/build.xml:48: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/src/contrib/hdfsproxy/build.xml:343: Tests failed!
Total time: 60 minutes 38 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED: org.apache.hadoop.hdfsproxy.TestAuthorizationFilter.testPathPermit
Error Message:
expected:<403> but was:<200>
Stack Trace:
junit.framework.AssertionFailedError: expected:<403> but was:<200>
at org.apache.hadoop.hdfsproxy.TestAuthorizationFilter.endPathPermit(TestAuthorizationFilter.java:113)
at org.apache.cactus.internal.client.ClientTestCaseCaller.callGenericEndMethod(ClientTestCaseCaller.java:442)
at org.apache.cactus.internal.client.ClientTestCaseCaller.callEndMethod(ClientTestCaseCaller.java:209)
at org.apache.cactus.internal.client.ClientTestCaseCaller.runTest(ClientTestCaseCaller.java:149)
at org.apache.cactus.internal.AbstractCactusTestCase.runBareClient(AbstractCactusTestCase.java:218)
at org.apache.cactus.internal.AbstractCactusTestCase.runBare(AbstractCactusTestCase.java:134)
FAILED: org.apache.hadoop.hdfsproxy.TestAuthorizationFilter.testPathPermitQualified
Error Message:
expected:<403> but was:<200>
Stack Trace:
junit.framework.AssertionFailedError: expected:<403> but was:<200>
at org.apache.hadoop.hdfsproxy.TestAuthorizationFilter.endPathPermitQualified(TestAuthorizationFilter.java:136)
at org.apache.cactus.internal.client.ClientTestCaseCaller.callGenericEndMethod(ClientTestCaseCaller.java:442)
at org.apache.cactus.internal.client.ClientTestCaseCaller.callEndMethod(ClientTestCaseCaller.java:209)
at org.apache.cactus.internal.client.ClientTestCaseCaller.runTest(ClientTestCaseCaller.java:149)
at org.apache.cactus.internal.AbstractCactusTestCase.runBareClient(AbstractCactusTestCase.java:218)
at org.apache.cactus.internal.AbstractCactusTestCase.runBare(AbstractCactusTestCase.java:134)
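Both failures above report `expected:<403> but was:<200>`, i.e. the authorization filter served the page instead of rejecting it. That message is junit.framework.Assert's standard mismatch format; a self-contained sketch of how it is produced (the helper below is illustrative, not taken from the test source):

```java
public class AssertSketch {
    // Reproduces junit.framework.Assert's mismatch format for two ints
    // (illustrative helper, not taken from TestAuthorizationFilter).
    static String mismatch(int expected, int actual) {
        return "expected:<" + expected + "> but was:<" + actual + ">";
    }

    static void assertEquals(int expected, int actual) {
        if (expected != actual) {
            throw new AssertionError(mismatch(expected, actual));
        }
    }

    public static void main(String[] args) {
        try {
            // The test expects HTTP 403 (Forbidden) but the proxy answered 200.
            assertEquals(403, 200);
        } catch (AssertionError e) {
            System.out.println(e.getMessage()); // prints expected:<403> but was:<200>
        }
    }
}
```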
Hadoop-Hdfs-22-branch - Build # 30 - Still Failing
See https://hudson.apache.org/hudson/job/Hadoop-Hdfs-22-branch/30/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 3318 lines...]
compile-hdfs-test:
[delete] Deleting directory /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
run-test-hdfs-excluding-commit-and-smoke:
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/data
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/logs
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/extraconf
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/extraconf
[junit] WARNING: multiple versions of ant detected in path for junit
[junit] jar:file:/homes/hudson/tools/ant/latest/lib/ant.jar!/org/apache/tools/ant/Project.class
[junit] and jar:file:/homes/hudson/.ivy2/cache/ant/ant/jars/ant-1.6.5.jar!/org/apache/tools/ant/Project.class
[junit] Running org.apache.hadoop.fs.TestFiListPath
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 2.156 sec
[junit] Running org.apache.hadoop.fs.TestFiRename
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 5.249 sec
[junit] Running org.apache.hadoop.hdfs.TestFiHFlush
[junit] Tests run: 9, Failures: 0, Errors: 0, Time elapsed: 15.641 sec
[junit] Running org.apache.hadoop.hdfs.TestFiHftp
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 33.541 sec
[junit] Running org.apache.hadoop.hdfs.TestFiPipelines
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.809 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiDataTransferProtocol
[junit] Tests run: 29, Failures: 0, Errors: 0, Time elapsed: 211.456 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiDataTransferProtocol2
[junit] Tests run: 10, Failures: 0, Errors: 0, Time elapsed: 463.717 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiPipelineClose
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 35.344 sec
checkfailure:
-run-test-hdfs-fault-inject-withtestcaseonly:
run-test-hdfs-fault-inject:
BUILD FAILED
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:745: Tests failed!
Total time: 50 minutes 16 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
REGRESSION: org.apache.hadoop.hdfs.server.namenode.TestBlockTokenWithDFS.testEnd2End
Error Message:
127.0.0.1:55394is not an underUtilized node
Stack Trace:
junit.framework.AssertionFailedError: 127.0.0.1:55394is not an underUtilized node
at org.apache.hadoop.hdfs.server.balancer.Balancer.initNodes(Balancer.java:1011)
at org.apache.hadoop.hdfs.server.balancer.Balancer.initNodes(Balancer.java:953)
at org.apache.hadoop.hdfs.server.balancer.Balancer.run(Balancer.java:1496)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.runBalancer(TestBalancer.java:247)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.test(TestBalancer.java:234)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.oneNodeTest(TestBalancer.java:307)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.integrationTest(TestBalancer.java:319)
at org.apache.hadoop.hdfs.server.namenode.TestBlockTokenWithDFS.__CLR3_0_2wspf0n10tj(TestBlockTokenWithDFS.java:529)
at org.apache.hadoop.hdfs.server.namenode.TestBlockTokenWithDFS.testEnd2End(TestBlockTokenWithDFS.java:526)
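The assertion text above, `127.0.0.1:55394is not an underUtilized node`, is missing a space after the address — characteristic of building the message by plain string concatenation. A sketch of the likely construction (an assumption about the Balancer source, which is not shown here; only the log output is):

```java
public class BalancerMessageSketch {
    // Likely shape of the assertion message seen above: concatenating the
    // node address directly with the text drops the separating space
    // (an assumption; the Balancer source itself is not in this thread).
    static String message(String node) {
        return node + "is not an underUtilized node";
    }

    public static void main(String[] args) {
        System.out.println(message("127.0.0.1:55394")); // prints 127.0.0.1:55394is not an underUtilized node
    }
}
```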
Hadoop-Hdfs-22-branch - Build # 29 - Still Failing
See https://hudson.apache.org/hudson/job/Hadoop-Hdfs-22-branch/29/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 3320 lines...]
compile-hdfs-test:
[delete] Deleting directory /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
run-test-hdfs-excluding-commit-and-smoke:
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/data
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/logs
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/extraconf
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/extraconf
[junit] WARNING: multiple versions of ant detected in path for junit
[junit] jar:file:/homes/hudson/tools/ant/latest/lib/ant.jar!/org/apache/tools/ant/Project.class
[junit] and jar:file:/homes/hudson/.ivy2/cache/ant/ant/jars/ant-1.6.5.jar!/org/apache/tools/ant/Project.class
[junit] Running org.apache.hadoop.fs.TestFiListPath
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 2.207 sec
[junit] Running org.apache.hadoop.fs.TestFiRename
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 5.608 sec
[junit] Running org.apache.hadoop.hdfs.TestFiHFlush
[junit] Tests run: 9, Failures: 0, Errors: 0, Time elapsed: 15.64 sec
[junit] Running org.apache.hadoop.hdfs.TestFiHftp
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 43.384 sec
[junit] Running org.apache.hadoop.hdfs.TestFiPipelines
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.501 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiDataTransferProtocol
[junit] Tests run: 29, Failures: 0, Errors: 0, Time elapsed: 210.721 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiDataTransferProtocol2
[junit] Tests run: 10, Failures: 0, Errors: 0, Time elapsed: 416.025 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiPipelineClose
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 35.602 sec
checkfailure:
-run-test-hdfs-fault-inject-withtestcaseonly:
run-test-hdfs-fault-inject:
BUILD FAILED
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:745: Tests failed!
Total time: 50 minutes 44 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
REGRESSION: org.apache.hadoop.hdfs.server.balancer.TestBalancer.testBalancer0
Error Message:
127.0.0.1:59191is not an underUtilized node
Stack Trace:
junit.framework.AssertionFailedError: 127.0.0.1:59191is not an underUtilized node
at org.apache.hadoop.hdfs.server.balancer.Balancer.initNodes(Balancer.java:1011)
at org.apache.hadoop.hdfs.server.balancer.Balancer.initNodes(Balancer.java:953)
at org.apache.hadoop.hdfs.server.balancer.Balancer.run(Balancer.java:1496)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.runBalancer(TestBalancer.java:247)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.test(TestBalancer.java:234)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.twoNodeTest(TestBalancer.java:312)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.__CLR3_0_29j3j5bp34(TestBalancer.java:328)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.testBalancer0(TestBalancer.java:324)
REGRESSION: org.apache.hadoop.hdfs.server.datanode.TestBlockReport.blockReport_08
Error Message:
Was waiting too long for a replica to become TEMPORARY
Stack Trace:
junit.framework.AssertionFailedError: Was waiting too long for a replica to become TEMPORARY
at org.apache.hadoop.hdfs.server.datanode.TestBlockReport.waitForTempReplica(TestBlockReport.java:514)
at org.apache.hadoop.hdfs.server.datanode.TestBlockReport.__CLR3_0_2j2e00j11c8(TestBlockReport.java:408)
at org.apache.hadoop.hdfs.server.datanode.TestBlockReport.blockReport_08(TestBlockReport.java:390)
Hadoop-Hdfs-22-branch - Build # 28 - Still Failing
See https://hudson.apache.org/hudson/job/Hadoop-Hdfs-22-branch/28/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 2384 lines...]
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/ivy
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/ivy
clean-sign:
sign:
signanddeploy:
simpledeploy:
[artifact:install-provider] Installing provider: org.apache.maven.wagon:wagon-http:jar:1.0-beta-2:runtime
[artifact:install-provider] Downloading: org/apache/maven/wagon/wagon-http/1.0-beta-2/wagon-http-1.0-beta-2.pom from central
[artifact:install-provider] Downloading: org/apache/maven/wagon/wagon-providers/1.0-beta-2/wagon-providers-1.0-beta-2.pom from central
[artifact:install-provider] Downloading: org/apache/maven/wagon/wagon/1.0-beta-2/wagon-1.0-beta-2.pom from central
[artifact:install-provider] Downloading: org/apache/maven/wagon/wagon-http-shared/1.0-beta-2/wagon-http-shared-1.0-beta-2.pom from central
[artifact:install-provider] Downloading: jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.pom from central
[artifact:install-provider] Downloading: org/apache/maven/wagon/wagon-provider-api/1.0-beta-2/wagon-provider-api-1.0-beta-2.pom from central
[artifact:install-provider] Downloading: commons-logging/commons-logging/1.0.3/commons-logging-1.0.3.pom from central
[artifact:install-provider] Downloading: commons-httpclient/commons-httpclient/2.0.2/commons-httpclient-2.0.2.pom from central
[artifact:install-provider] Downloading: org/apache/maven/wagon/wagon-http/1.0-beta-2/wagon-http-1.0-beta-2.jar from central
[artifact:install-provider] Downloading: org/apache/maven/wagon/wagon-http-shared/1.0-beta-2/wagon-http-shared-1.0-beta-2.jar from central
[artifact:install-provider] Downloading: jtidy/jtidy/4aug2000r7-dev/jtidy-4aug2000r7-dev.jar from central
[artifact:install-provider] Downloading: org/apache/maven/wagon/wagon-provider-api/1.0-beta-2/wagon-provider-api-1.0-beta-2.jar from central
[artifact:install-provider] Downloading: org/codehaus/plexus/plexus-utils/1.0.4/plexus-utils-1.0.4.jar from central
[artifact:install-provider] Downloading: commons-logging/commons-logging/1.0.3/commons-logging-1.0.3.jar from central
[artifact:install-provider] Downloading: commons-httpclient/commons-httpclient/2.0.2/commons-httpclient-2.0.2.jar from central
[artifact:deploy] Deploying to https://repository.apache.org/content/repositories/snapshots
[artifact:deploy] [INFO] Retrieving previous build number from apache.snapshots.https
[artifact:deploy] Uploading: org/apache/hadoop/hadoop-hdfs/0.22.0-SNAPSHOT/hadoop-hdfs-0.22.0-20110307.223307-340.jar to apache.snapshots.https
[artifact:deploy] Uploaded 1013K
[artifact:deploy] An error has occurred while processing the Maven artifact tasks.
[artifact:deploy] Diagnosis:
[artifact:deploy]
[artifact:deploy] Error deploying artifact 'org.apache.hadoop:hadoop-hdfs:jar': Error deploying artifact: Failed to transfer file: https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-hdfs/0.22.0-SNAPSHOT/hadoop-hdfs-0.22.0-20110307.223307-340.jar. Return code is: 401
BUILD FAILED
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:1669: Error deploying artifact 'org.apache.hadoop:hadoop-hdfs:jar': Error deploying artifact: Failed to transfer file: https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-hdfs/0.22.0-SNAPSHOT/hadoop-hdfs-0.22.0-20110307.223307-340.jar. Return code is: 401
Total time: 1 minute 45 seconds
======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================
mv: cannot stat `build/test/findbugs': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.
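Unlike the other builds in this thread, build #28 never reached the tests: the Maven deploy step failed with return code 401, HTTP's "Unauthorized" status — typically meaning the slave's credentials for repository.apache.org were missing or stale (an inference; the log only shows the code). A small lookup covering the status codes that appear in this thread:

```java
public class HttpStatusSketch {
    // Reason phrases for the HTTP status codes seen in this thread
    // (401 from the deploy step here; 403/200 in the hdfsproxy failures).
    static String reason(int code) {
        switch (code) {
            case 200: return "OK";
            case 401: return "Unauthorized";
            case 403: return "Forbidden";
            default:  return "Unknown";
        }
    }

    public static void main(String[] args) {
        System.out.println(401 + " " + reason(401)); // prints 401 Unauthorized
    }
}
```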
Hadoop-Hdfs-22-branch - Build # 27 - Still Failing
See https://hudson.apache.org/hudson/job/Hadoop-Hdfs-22-branch/27/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 3300 lines...]
compile-hdfs-test:
[delete] Deleting directory /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
run-test-hdfs-excluding-commit-and-smoke:
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/data
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/logs
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/extraconf
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/extraconf
[junit] WARNING: multiple versions of ant detected in path for junit
[junit] jar:file:/homes/hudson/tools/ant/latest/lib/ant.jar!/org/apache/tools/ant/Project.class
[junit] and jar:file:/homes/hudson/.ivy2/cache/ant/ant/jars/ant-1.6.5.jar!/org/apache/tools/ant/Project.class
[junit] Running org.apache.hadoop.fs.TestFiListPath
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 2.287 sec
[junit] Running org.apache.hadoop.fs.TestFiRename
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 5.499 sec
[junit] Running org.apache.hadoop.hdfs.TestFiHFlush
[junit] Tests run: 9, Failures: 0, Errors: 0, Time elapsed: 15.502 sec
[junit] Running org.apache.hadoop.hdfs.TestFiHftp
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 38.621 sec
[junit] Running org.apache.hadoop.hdfs.TestFiPipelines
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.427 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiDataTransferProtocol
[junit] Tests run: 29, Failures: 0, Errors: 0, Time elapsed: 211.601 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiDataTransferProtocol2
[junit] Tests run: 10, Failures: 0, Errors: 0, Time elapsed: 398.931 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiPipelineClose
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 35.447 sec
checkfailure:
-run-test-hdfs-fault-inject-withtestcaseonly:
run-test-hdfs-fault-inject:
BUILD FAILED
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:745: Tests failed!
Total time: 103 minutes 25 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
REGRESSION: org.apache.hadoop.cli.TestHDFSCLI.testAll
Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
REGRESSION: org.apache.hadoop.hdfs.server.datanode.TestBlockRecovery.testErrorReplicas
Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
Hadoop-Hdfs-22-branch - Build # 26 - Still Failing
See https://hudson.apache.org/hudson/job/Hadoop-Hdfs-22-branch/26/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 3520 lines...]
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target
[echo] Including clover.jar in the war file ...
[cactifywar] Analyzing war: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/hdfsproxy-2.0-test.war
[cactifywar] Building war: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/test.war
cactifywar:
test-cactus:
[echo] Free Ports: startup-41060 / http-41061 / https-41062
[echo] Please take a deep breath while Cargo gets the Tomcat for running the servlet tests...
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/tomcat-config
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/tomcat-config/webapps
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/tomcat-config/temp
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/logs
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/reports
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf
[cactus] -----------------------------------------------------------------
[cactus] Running tests against Tomcat 5.x @ http://localhost:41061
[cactus] -----------------------------------------------------------------
[cactus] Deploying [/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/test.war] to [/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/contrib/hdfsproxy/target/tomcat-config/webapps]...
[cactus] Tomcat 5.x starting...
Server [Apache-Coyote/1.1] started
[cactus] WARNING: multiple versions of ant detected in path for junit
[cactus] jar:file:/homes/hudson/tools/ant/latest/lib/ant.jar!/org/apache/tools/ant/Project.class
[cactus] and jar:file:/homes/hudson/.ivy2/cache/ant/ant/jars/ant-1.6.5.jar!/org/apache/tools/ant/Project.class
[cactus] Running org.apache.hadoop.hdfsproxy.TestAuthorizationFilter
[cactus] Tests run: 4, Failures: 2, Errors: 0, Time elapsed: 0.681 sec
[cactus] Test org.apache.hadoop.hdfsproxy.TestAuthorizationFilter FAILED
[cactus] Running org.apache.hadoop.hdfsproxy.TestLdapIpDirFilter
[cactus] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.331 sec
[cactus] Tomcat 5.x started on port [41061]
[cactus] Running org.apache.hadoop.hdfsproxy.TestProxyFilter
[cactus] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.341 sec
[cactus] Running org.apache.hadoop.hdfsproxy.TestProxyForwardServlet
[cactus] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.342 sec
[cactus] Running org.apache.hadoop.hdfsproxy.TestProxyUtil
[cactus] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.856 sec
[cactus] Tomcat 5.x is stopping...
[cactus] Tomcat 5.x is stopped
BUILD FAILED
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:749: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:730: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/src/contrib/build.xml:48: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/src/contrib/hdfsproxy/build.xml:343: Tests failed!
Total time: 59 minutes 19 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED: org.apache.hadoop.hdfsproxy.TestAuthorizationFilter.testPathPermit
Error Message:
expected:<403> but was:<200>
Stack Trace:
junit.framework.AssertionFailedError: expected:<403> but was:<200>
at org.apache.hadoop.hdfsproxy.TestAuthorizationFilter.endPathPermit(TestAuthorizationFilter.java:113)
at org.apache.cactus.internal.client.ClientTestCaseCaller.callGenericEndMethod(ClientTestCaseCaller.java:442)
at org.apache.cactus.internal.client.ClientTestCaseCaller.callEndMethod(ClientTestCaseCaller.java:209)
at org.apache.cactus.internal.client.ClientTestCaseCaller.runTest(ClientTestCaseCaller.java:149)
at org.apache.cactus.internal.AbstractCactusTestCase.runBareClient(AbstractCactusTestCase.java:218)
at org.apache.cactus.internal.AbstractCactusTestCase.runBare(AbstractCactusTestCase.java:134)
FAILED: org.apache.hadoop.hdfsproxy.TestAuthorizationFilter.testPathPermitQualified
Error Message:
expected:<403> but was:<200>
Stack Trace:
junit.framework.AssertionFailedError: expected:<403> but was:<200>
at org.apache.hadoop.hdfsproxy.TestAuthorizationFilter.endPathPermitQualified(TestAuthorizationFilter.java:136)
at org.apache.cactus.internal.client.ClientTestCaseCaller.callGenericEndMethod(ClientTestCaseCaller.java:442)
at org.apache.cactus.internal.client.ClientTestCaseCaller.callEndMethod(ClientTestCaseCaller.java:209)
at org.apache.cactus.internal.client.ClientTestCaseCaller.runTest(ClientTestCaseCaller.java:149)
at org.apache.cactus.internal.AbstractCactusTestCase.runBareClient(AbstractCactusTestCase.java:218)
at org.apache.cactus.internal.AbstractCactusTestCase.runBare(AbstractCactusTestCase.java:134)
Hadoop-Hdfs-22-branch - Build # 25 - Still Failing
Posted by Apache Hudson Server <hu...@hudson.apache.org>.
See https://hudson.apache.org/hudson/job/Hadoop-Hdfs-22-branch/25/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 2808 lines...]
[junit] at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1404)
[junit] )
[junit] Running org.apache.hadoop.hdfs.TestFileAppend3
[junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
[junit] Test org.apache.hadoop.hdfs.TestFileAppend3 FAILED (timeout)
[junit] Running org.apache.hadoop.hdfs.TestFileCorruption
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 593.727 sec
[junit] Running org.apache.hadoop.hdfs.TestFileStatus
[junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 150.045 sec
[junit] Running org.apache.hadoop.hdfs.TestGetBlocks
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 234.16 sec
[junit] Running org.apache.hadoop.hdfs.TestHDFSServerPorts
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 67.82 sec
[junit] Running org.apache.hadoop.hdfs.TestHDFSTrash
[junit] Tests run: 3, Failures: 1, Errors: 0, Time elapsed: 319.184 sec
[junit] Test org.apache.hadoop.hdfs.TestHDFSTrash FAILED
[junit] Running org.apache.hadoop.hdfs.TestLease
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 234.076 sec
[junit] Running org.apache.hadoop.hdfs.TestLeaseRecovery
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 499.446 sec
[junit] Running org.apache.hadoop.hdfs.TestLocalDFS
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 149.927 sec
[junit] Running org.apache.hadoop.hdfs.TestMissingBlocksAlert
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 151.653 sec
[junit] Running org.apache.hadoop.hdfs.TestPread
[junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
[junit] Test org.apache.hadoop.hdfs.TestPread FAILED (timeout)
[junit] Running org.apache.hadoop.hdfs.TestQuota
[junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
[junit] Test org.apache.hadoop.hdfs.TestQuota FAILED (timeout)
[junit] Running org.apache.hadoop.hdfs.TestRestartDFS
[junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
[junit] Test org.apache.hadoop.hdfs.TestRestartDFS FAILED (timeout)
[junit] Running org.apache.hadoop.hdfs.TestSafeMode
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 194.815 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestBlockReplacement
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 421.968 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestDirectoryScanner
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 150.91 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestDiskError
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 638.218 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestInterDatanodeProtocol
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 656.247 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestSimulatedFSDataset
[junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.703 sec
[junit] Running org.apache.hadoop.hdfs.server.namenode.TestBackupNode
Build timed out. Aborting
[FINDBUGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
11 tests failed.
FAILED: org.apache.hadoop.hdfs.TestDFSShell.testErrOutPut
Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
FAILED: org.apache.hadoop.hdfs.TestDFSUpgradeFromImage.testUpgradeFromImage
Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
FAILED: org.apache.hadoop.hdfs.TestDistributedFileSystem.testAllWithDualPort
Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
FAILED: org.apache.hadoop.hdfs.TestFileAppend.testComplexFlush
Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
FAILED: org.apache.hadoop.hdfs.TestFileAppend2.testComplexAppend
Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
FAILED: org.apache.hadoop.hdfs.TestFileAppend3.testAppendToPartialChunk
Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
FAILED: org.apache.hadoop.hdfs.TestHDFSTrash.testTrashEmptier
Error Message:
null
Stack Trace:
junit.framework.AssertionFailedError: null
at org.apache.hadoop.fs.TestTrash.testTrashEmptier(TestTrash.java:460)
at junit.extensions.TestDecorator.basicRun(TestDecorator.java:24)
at junit.extensions.TestSetup$1.protect(TestSetup.java:23)
at junit.extensions.TestSetup.run(TestSetup.java:27)
FAILED: org.apache.hadoop.hdfs.TestPread.testPreadDFSSimulated
Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
FAILED: org.apache.hadoop.hdfs.TestQuota.testMultipleFilesSmallerThanOneBlock
Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
FAILED: org.apache.hadoop.hdfs.TestRestartDFS.testRestartDualPortDFS
Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
FAILED: TEST-org.apache.hadoop.hdfs.server.namenode.TestBackupNode.xml.<init>
Error Message:
Stack Trace:
Test report file /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/TEST-org.apache.hadoop.hdfs.server.namenode.TestBackupNode.xml was length 0
Hadoop-Hdfs-22-branch - Build # 24 - Still Failing
Posted by Apache Hudson Server <hu...@hudson.apache.org>.
See https://hudson.apache.org/hudson/job/Hadoop-Hdfs-22-branch/24/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 2708 lines...]
ivy-resolve-test:
ivy-retrieve-test:
compile-hdfs-test:
[delete] Deleting directory /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/cache
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/cache
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/cache
run-commit-test:
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/data
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/logs
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/extraconf
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/extraconf
[junit] WARNING: multiple versions of ant detected in path for junit
[junit] jar:file:/homes/hudson/tools/ant/latest/lib/ant.jar!/org/apache/tools/ant/Project.class
[junit] and jar:file:/homes/hudson/.ivy2/cache/ant/ant/jars/ant-1.6.5.jar!/org/apache/tools/ant/Project.class
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestBlockRecovery
[junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
[junit] Test org.apache.hadoop.hdfs.server.datanode.TestBlockRecovery FAILED (timeout)
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestDataDirs
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.525 sec
[junit] Running org.apache.hadoop.hdfs.server.namenode.TestGetImageServlet
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.48 sec
[junit] Running org.apache.hadoop.hdfs.server.namenode.TestINodeFile
[junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 0.225 sec
[junit] Running org.apache.hadoop.hdfs.server.namenode.TestNNLeaseRecovery
[junit] Tests run: 11, Failures: 0, Errors: 0, Time elapsed: 2.348 sec
checkfailure:
[touch] Creating /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/testsfailed
[delete] Deleting directory /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/data
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/data
[delete] Deleting directory /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/logs
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/logs
[junit] WARNING: multiple versions of ant detected in path for junit
[junit] jar:file:/homes/hudson/tools/ant/latest/lib/ant.jar!/org/apache/tools/ant/Project.class
[junit] and jar:file:/homes/hudson/.ivy2/cache/ant/ant/jars/ant-1.6.5.jar!/org/apache/tools/ant/Project.class
[junit] Running org.apache.hadoop.cli.TestHDFSCLI
Build timed out. Aborting
[FINDBUGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED: TEST-org.apache.hadoop.cli.TestHDFSCLI.xml.<init>
Error Message:
Stack Trace:
Test report file /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/TEST-org.apache.hadoop.cli.TestHDFSCLI.xml was length 0
REGRESSION: org.apache.hadoop.hdfs.server.datanode.TestBlockRecovery.testErrorReplicas
Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
Hadoop-Hdfs-22-branch - Build # 23 - Still Failing
Posted by Apache Hudson Server <hu...@hudson.apache.org>.
See https://hudson.apache.org/hudson/job/Hadoop-Hdfs-22-branch/23/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 2966 lines...]
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 37.143 sec
[junit] Running org.apache.hadoop.hdfs.TestLeaseRecovery2
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 72.192 sec
[junit] Running org.apache.hadoop.hdfs.TestListFilesInDFS
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 2.605 sec
[junit] Running org.apache.hadoop.hdfs.TestListFilesInFileContext
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 2.541 sec
[junit] Running org.apache.hadoop.hdfs.TestListPathServlet
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 2.507 sec
[junit] Running org.apache.hadoop.hdfs.TestModTime
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 3.857 sec
[junit] Running org.apache.hadoop.hdfs.TestMultiThreadedHflush
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 13.599 sec
[junit] Running org.apache.hadoop.hdfs.TestPipelines
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 3.217 sec
[junit] Running org.apache.hadoop.hdfs.TestReadWhileWriting
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 5.271 sec
[junit] Running org.apache.hadoop.hdfs.TestRenameWhileOpen
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 51.069 sec
[junit] Running org.apache.hadoop.hdfs.TestReplication
[junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 23.609 sec
[junit] Running org.apache.hadoop.hdfs.TestSeekBug
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 2.799 sec
[junit] Running org.apache.hadoop.hdfs.TestSetTimes
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 8.028 sec
[junit] Running org.apache.hadoop.hdfs.TestSetrepDecreasing
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 14.912 sec
[junit] Running org.apache.hadoop.hdfs.TestSetrepIncreasing
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 27.698 sec
[junit] Running org.apache.hadoop.hdfs.TestSmallBlock
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 4.381 sec
[junit] Running org.apache.hadoop.hdfs.TestWriteConfigurationToDFS
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 3.095 sec
[junit] Running org.apache.hadoop.hdfs.security.TestDelegationTokenForProxyUser
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 2.276 sec
[junit] Running org.apache.hadoop.hdfs.security.token.block.TestBlockToken
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 0.871 sec
[junit] Running org.apache.hadoop.hdfs.server.balancer.TestBalancer
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 45.653 sec
[junit] Running org.apache.hadoop.hdfs.server.common.TestDistributedUpgrade
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 27.093 sec
[junit] Running org.apache.hadoop.hdfs.server.common.TestGetUriFromString
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 0.158 sec
[junit] Running org.apache.hadoop.hdfs.server.common.TestJspHelper
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.964 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestBlockReport
Build timed out. Aborting
[junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
[junit] Test org.apache.hadoop.hdfs.server.datanode.TestBlockReport FAILED (crashed)
[FINDBUGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
REGRESSION: org.apache.hadoop.hdfs.TestFileConcurrentReader.testUnfinishedBlockCRCErrorNormalTransfer
Error Message:
java.io.FileNotFoundException: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/classes/hdfs-default.xml (Too many open files)
Stack Trace:
java.lang.RuntimeException: java.io.FileNotFoundException: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/classes/hdfs-default.xml (Too many open files)
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1536)
at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1401)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1347)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:600)
at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:794)
at org.apache.hadoop.hdfs.TestFileConcurrentReader.runTestUnfinishedBlockCRCError(TestFileConcurrentReader.java:313)
at org.apache.hadoop.hdfs.TestFileConcurrentReader.runTestUnfinishedBlockCRCError(TestFileConcurrentReader.java:302)
at org.apache.hadoop.hdfs.TestFileConcurrentReader.__CLR3_0_2k9gmsjsbs(TestFileConcurrentReader.java:285)
at org.apache.hadoop.hdfs.TestFileConcurrentReader.testUnfinishedBlockCRCErrorNormalTransfer(TestFileConcurrentReader.java:284)
Caused by: java.io.FileNotFoundException: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/classes/hdfs-default.xml (Too many open files)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:106)
at java.io.FileInputStream.<init>(FileInputStream.java:66)
at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:70)
at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:161)
at com.sun.org.apache.xerces.internal.impl.XMLEntityManager.setupCurrentEntity(XMLEntityManager.java:653)
at com.sun.org.apache.xerces.internal.impl.XMLVersionDetector.determineDocVersion(XMLVersionDetector.java:186)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:772)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:737)
at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:119)
at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:235)
at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:284)
at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:180)
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1450)
FAILED: org.apache.hadoop.hdfs.server.datanode.TestBlockReport.blockReport_06
Error Message:
Forked Java VM exited abnormally. Please note the time in the report does not reflect the time until the VM exit.
Stack Trace:
junit.framework.AssertionFailedError: Forked Java VM exited abnormally. Please note the time in the report does not reflect the time until the VM exit.
FAILED: TEST-org.apache.hadoop.hdfs.server.datanode.TestDataNodeMXBean.xml.<init>
Error Message:
Stack Trace:
Test report file /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build/test/TEST-org.apache.hadoop.hdfs.server.datanode.TestDataNodeMXBean.xml was length 0