Posted to hdfs-dev@hadoop.apache.org by Apache Hudson Server <hu...@hudson.apache.org> on 2011/02/01 07:34:20 UTC
Hadoop-Hdfs-trunk-Commit - Build # 524 - Still Failing
See https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/524/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 146390 lines...]
[junit] 2011-02-01 06:33:34,963 WARN datanode.FSDatasetAsyncDiskService (FSDatasetAsyncDiskService.java:shutdown(130)) - AsyncDiskService has already shut down.
[junit] 2011-02-01 06:33:34,963 INFO hdfs.MiniDFSCluster (MiniDFSCluster.java:shutdownDataNodes(835)) - Shutting down DataNode 0
[junit] 2011-02-01 06:33:35,065 INFO ipc.Server (Server.java:stop(1610)) - Stopping server on 34496
[junit] 2011-02-01 06:33:35,065 INFO ipc.Server (Server.java:run(1443)) - IPC Server handler 1 on 34496: exiting
[junit] 2011-02-01 06:33:35,065 INFO ipc.Server (Server.java:run(1443)) - IPC Server handler 2 on 34496: exiting
[junit] 2011-02-01 06:33:35,066 WARN datanode.DataNode (DataXceiverServer.java:run(141)) - DatanodeRegistration(127.0.0.1:56775, storageID=DS-525590048-127.0.1.1-56775-1296542014052, infoPort=48707, ipcPort=34496):DataXceiveServer: java.nio.channels.AsynchronousCloseException
[junit] at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
[junit] at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:152)
[junit] at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:134)
[junit] at java.lang.Thread.run(Thread.java:619)
[junit]
[junit] 2011-02-01 06:33:35,066 INFO ipc.Server (Server.java:run(1443)) - IPC Server handler 0 on 34496: exiting
[junit] 2011-02-01 06:33:35,065 INFO ipc.Server (Server.java:run(475)) - Stopping IPC Server listener on 34496
[junit] 2011-02-01 06:33:35,065 INFO ipc.Server (Server.java:run(675)) - Stopping IPC Server Responder
[junit] 2011-02-01 06:33:35,065 INFO datanode.DataNode (DataNode.java:shutdown(786)) - Waiting for threadgroup to exit, active threads is 1
[junit] 2011-02-01 06:33:35,067 INFO datanode.DataBlockScanner (DataBlockScanner.java:run(622)) - Exiting DataBlockScanner thread.
[junit] 2011-02-01 06:33:35,067 INFO datanode.DataNode (DataNode.java:run(1460)) - DatanodeRegistration(127.0.0.1:56775, storageID=DS-525590048-127.0.1.1-56775-1296542014052, infoPort=48707, ipcPort=34496):Finishing DataNode in: FSDataset{dirpath='/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/data/dfs/data/data1/current/finalized,/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/data/dfs/data/data2/current/finalized'}
[junit] 2011-02-01 06:33:35,067 INFO ipc.Server (Server.java:stop(1610)) - Stopping server on 34496
[junit] 2011-02-01 06:33:35,067 INFO datanode.DataNode (DataNode.java:shutdown(786)) - Waiting for threadgroup to exit, active threads is 0
[junit] 2011-02-01 06:33:35,068 INFO datanode.FSDatasetAsyncDiskService (FSDatasetAsyncDiskService.java:shutdown(133)) - Shutting down all async disk service threads...
[junit] 2011-02-01 06:33:35,068 INFO datanode.FSDatasetAsyncDiskService (FSDatasetAsyncDiskService.java:shutdown(142)) - All async disk service threads have been shut down.
[junit] 2011-02-01 06:33:35,068 WARN datanode.FSDatasetAsyncDiskService (FSDatasetAsyncDiskService.java:shutdown(130)) - AsyncDiskService has already shut down.
[junit] 2011-02-01 06:33:35,170 WARN namenode.FSNamesystem (FSNamesystem.java:run(2845)) - ReplicationMonitor thread received InterruptedException. java.lang.InterruptedException: sleep interrupted
[junit] 2011-02-01 06:33:35,170 WARN namenode.DecommissionManager (DecommissionManager.java:run(70)) - Monitor interrupted: java.lang.InterruptedException: sleep interrupted
[junit] 2011-02-01 06:33:35,170 INFO namenode.FSEditLog (FSEditLog.java:printStatistics(595)) - Number of transactions: 12 Total time for transactions(ms): 0 Number of transactions batched in Syncs: 1 Number of syncs: 9 SyncTimes(ms): 7 6
[junit] 2011-02-01 06:33:35,171 INFO ipc.Server (Server.java:stop(1610)) - Stopping server on 56416
[junit] 2011-02-01 06:33:35,172 INFO ipc.Server (Server.java:run(1443)) - IPC Server handler 0 on 56416: exiting
[junit] 2011-02-01 06:33:35,172 INFO ipc.Server (Server.java:run(1443)) - IPC Server handler 1 on 56416: exiting
[junit] 2011-02-01 06:33:35,172 INFO ipc.Server (Server.java:run(1443)) - IPC Server handler 5 on 56416: exiting
[junit] 2011-02-01 06:33:35,172 INFO ipc.Server (Server.java:run(475)) - Stopping IPC Server listener on 56416
[junit] 2011-02-01 06:33:35,173 INFO ipc.Server (Server.java:run(1443)) - IPC Server handler 9 on 56416: exiting
[junit] 2011-02-01 06:33:35,173 INFO ipc.Server (Server.java:run(1443)) - IPC Server handler 8 on 56416: exiting
[junit] 2011-02-01 06:33:35,172 INFO ipc.Server (Server.java:run(1443)) - IPC Server handler 3 on 56416: exiting
[junit] 2011-02-01 06:33:35,172 INFO ipc.Server (Server.java:run(1443)) - IPC Server handler 7 on 56416: exiting
[junit] 2011-02-01 06:33:35,172 INFO ipc.Server (Server.java:run(1443)) - IPC Server handler 6 on 56416: exiting
[junit] 2011-02-01 06:33:35,172 INFO ipc.Server (Server.java:run(1443)) - IPC Server handler 4 on 56416: exiting
[junit] 2011-02-01 06:33:35,172 INFO ipc.Server (Server.java:run(675)) - Stopping IPC Server Responder
[junit] 2011-02-01 06:33:35,172 INFO ipc.Server (Server.java:run(1443)) - IPC Server handler 2 on 56416: exiting
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 4.908 sec
checkfailure:
[touch] Creating /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/testsfailed
BUILD FAILED
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:702: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:659: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:727: Tests failed!
Total time: 9 minutes 3 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Recording fingerprints
Archiving artifacts
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED: org.apache.hadoop.hdfs.server.namenode.TestStorageRestore.testStorageRestore
Error Message:
Image file /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/data/dfs/secondary/current/fsimage is corrupt with MD5 checksum of a1c7cd655f50e4b443a09a140a13fa35 but expecting b3ecd0b22758686f75d28e493f090ce0
Stack Trace:
java.io.IOException: Image file /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/data/dfs/secondary/current/fsimage is corrupt with MD5 checksum of a1c7cd655f50e4b443a09a140a13fa35 but expecting b3ecd0b22758686f75d28e493f090ce0
at org.apache.hadoop.hdfs.server.namenode.FSImage.loadFSImage(FSImage.java:1063)
at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.doMerge(SecondaryNameNode.java:702)
at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.access$500(SecondaryNameNode.java:600)
at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.doMerge(SecondaryNameNode.java:477)
at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.doCheckpoint(SecondaryNameNode.java:438)
at org.apache.hadoop.hdfs.server.namenode.TestStorageRestore.__CLR3_0_2dn2tm4rsr(TestStorageRestore.java:316)
at org.apache.hadoop.hdfs.server.namenode.TestStorageRestore.testStorageRestore(TestStorageRestore.java:286)
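The failure above is the secondary namenode refusing to load an fsimage whose computed MD5 digest does not match the digest recorded when the image was saved. A minimal sketch of that kind of check, using only the JDK (the class and method names here are illustrative stand-ins, not the actual FSImage API):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.DigestInputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class ImageChecksum {

    /** Hex-encode a digest, matching the lowercase form seen in the log. */
    static String toHex(byte[] digest) {
        StringBuilder sb = new StringBuilder();
        for (byte b : digest) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    /** Stream the whole file through an MD5 digest. */
    static String md5Of(Path file) throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("MD5");
        try (InputStream in = new DigestInputStream(Files.newInputStream(file), md)) {
            byte[] buf = new byte[8192];
            while (in.read(buf) != -1) {
                // reading the stream drives the digest; nothing else to do
            }
        }
        return toHex(md.digest());
    }

    /** Throw, as loadFSImage does, when the digests disagree. */
    static void verify(Path file, String expectedHex)
            throws IOException, NoSuchAlgorithmException {
        String actual = md5Of(file);
        if (!actual.equals(expectedHex)) {
            throw new IOException("Image file " + file
                    + " is corrupt with MD5 checksum of " + actual
                    + " but expecting " + expectedHex);
        }
    }
}
```

A mismatch like the one in this build can mean either real on-disk corruption or, as in an intermittent test failure, the image being read while it is still being written.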
Hadoop-Hdfs-trunk-Commit - Build # 525 - Still Failing
Posted by Apache Hudson Server <hu...@hudson.apache.org>.
See https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/525/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1069 lines...]
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/webapps/datanode/WEB-INF
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/webapps/secondary/WEB-INF
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/ant
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/c++
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/hdfs/classes
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/extraconf
[touch] Creating /tmp/null2097763057
[delete] Deleting: /tmp/null2097763057
[copy] Copying 3 files to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/webapps
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/conf
[copy] Copying /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/conf/hdfs-site.xml.template to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/conf/hdfs-site.xml
[mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/conf
[copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/conf
[copy] Copying /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/conf/hdfs-site.xml.template to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/conf/hdfs-site.xml
compile-hdfs-classes:
[javac] Compiling 228 source files to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/classes
[javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/SecondaryNameNode.java:200: cannot find symbol
[javac] symbol : method get(int)
[javac] location: class java.lang.String[]
[javac] Krb5AndCertsSslSocketConnector.KRB5_CIPHER_SUITES.get(0));
[javac] ^
[javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/tools/DFSck.java:200: cannot find symbol
[javac] symbol : method get(int)
[javac] location: class java.lang.String[]
[javac] Krb5AndCertsSslSocketConnector.KRB5_CIPHER_SUITES.get(0));
[javac] ^
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
[javac] 2 errors
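The two compile errors above have the same cause: the compiler reports `location: class java.lang.String[]`, so `KRB5_CIPHER_SUITES` is now an array, while the call sites in SecondaryNameNode and DFSck still use the `List` accessor `get(int)`, which arrays do not have. A sketch of the breakage and the fix (the class and the suite value here are stand-ins, not the real Krb5AndCertsSslSocketConnector):

```java
import java.util.Arrays;
import java.util.List;

public class CipherSuites {
    // Array form, as javac reports ("location: class java.lang.String[]"):
    public static final String[] KRB5_CIPHER_SUITES =
            new String[] { "TLS_KRB5_WITH_3DES_EDE_CBC_SHA" };

    // Arrays have no get(int) method, so the old call site fails to compile:
    //   KRB5_CIPHER_SUITES.get(0);   // error: cannot find symbol
    // The fix at the call sites is plain indexing:
    static String firstSuite() {
        return KRB5_CIPHER_SUITES[0];
    }

    // Alternatively, keep the List-style access by wrapping the array:
    static String firstSuiteViaList() {
        List<String> suites = Arrays.asList(KRB5_CIPHER_SUITES);
        return suites.get(0);
    }
}
```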
BUILD FAILED
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:344: Compile failed; see the compiler error output for details.
Total time: 15 seconds
======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================
mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/*.jar': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Recording fingerprints
Archiving artifacts
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.