Posted to mapreduce-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2011/06/12 03:31:25 UTC

Hadoop-Mapreduce-trunk-Commit - Build # 722 - Failure

See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Commit/722/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 12635 lines...]
    [junit] Running org.apache.hadoop.mapred.TestQueueAclsForCurrentUser
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 0.666 sec
    [junit] Running org.apache.hadoop.mapred.TestRackAwareTaskPlacement
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 1.886 sec
    [junit] Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
    [junit] Tests run: 0, Failures: 0, Errors: 1, Time elapsed: 14.92 sec
    [junit] Test org.apache.hadoop.mapred.TestReduceFetchFromPartialMem FAILED
    [junit] Running org.apache.hadoop.mapred.TestReduceTask
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.681 sec
    [junit] Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryInputFormat
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.928 sec
    [junit] Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 1.151 sec
    [junit] Running org.apache.hadoop.mapred.TestSequenceFileInputFormat
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 7.227 sec
    [junit] Running org.apache.hadoop.mapred.TestSeveral
    [junit] Tests run: 0, Failures: 0, Errors: 1, Time elapsed: 14.691 sec
    [junit] Test org.apache.hadoop.mapred.TestSeveral FAILED
    [junit] Running org.apache.hadoop.mapred.TestSpeculativeExecution
    [junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 4.389 sec
    [junit] Running org.apache.hadoop.mapred.TestTaskLimits
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 4.869 sec
    [junit] Running org.apache.hadoop.mapred.TestTaskTrackerBlacklisting
    [junit] Tests run: 7, Failures: 0, Errors: 0, Time elapsed: 1.973 sec
    [junit] Running org.apache.hadoop.mapred.TestTextInputFormat
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 104.074 sec
    [junit] Running org.apache.hadoop.mapred.TestTextOutputFormat
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.21 sec
    [junit] Running org.apache.hadoop.mapred.TestTrackerBlacklistAcrossJobs
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 49.282 sec
    [junit] Running org.apache.hadoop.mapreduce.TestCounters
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.16 sec
    [junit] Running org.apache.hadoop.mapreduce.TestMapCollection
    [junit] Tests run: 11, Failures: 0, Errors: 0, Time elapsed: 26.655 sec
    [junit] Running org.apache.hadoop.mapreduce.TestMapReduceLocal
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 30.401 sec
    [junit] Running org.apache.hadoop.mapreduce.lib.input.TestFileInputFormat
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 1.038 sec
    [junit] Running org.apache.hadoop.mapreduce.lib.output.TestFileOutputCommitter
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 0.695 sec

checkfailure:
    [touch] Creating /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/build/test/testsfailed

BUILD FAILED
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/build.xml:807: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/build.xml:770: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/build.xml:831: Tests failed!

Total time: 7 minutes 18 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Recording fingerprints
Archiving artifacts
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
REGRESSION:  org.apache.hadoop.mapred.TestCommandLineJobSubmission.testJobShell

Error Message:
com/google/protobuf/MessageOrBuilder

Stack Trace:
java.lang.NoClassDefFoundError: com/google/protobuf/MessageOrBuilder
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
	at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
	at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
	at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PacketHeader.<clinit>(DataTransferProtocol.java:504)
	at org.apache.hadoop.hdfs.DFSOutputStream.computePacketChunkSize(DFSOutputStream.java:1282)
	at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1237)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:747)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:705)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:255)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:725)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:706)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:605)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:594)
	at org.apache.hadoop.mapred.TestCommandLineJobSubmission.__CLR3_0_25qkf64192v(TestCommandLineJobSubmission.java:55)
	at org.apache.hadoop.mapred.TestCommandLineJobSubmission.testJobShell(TestCommandLineJobSubmission.java:45)
Caused by: java.lang.ClassNotFoundException: com.google.protobuf.MessageOrBuilder
	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
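
The root cause in the trace above is a classpath gap: DataTransferProtocol$PacketHeader's static initializer (DataTransferProtocol.java:504) pulls in protobuf-generated code, and the protobuf-java jar providing com.google.protobuf.MessageOrBuilder was not on the test classpath. A minimal standalone probe (a diagnostic sketch, not part of the Hadoop build) can confirm whether a given class is loadable before running the suite:

```java
// ClasspathProbe.java -- standalone diagnostic sketch, not part of the Hadoop build.
// Reports whether the class named in the stack trace is visible on the classpath.
public class ClasspathProbe {
    /** Returns true if className can be resolved by this JVM's classpath. */
    public static boolean isLoadable(String className) {
        try {
            // initialize=false: resolve the class without running its static initializers
            Class.forName(className, false, ClasspathProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String cls = "com.google.protobuf.MessageOrBuilder";
        System.out.println(cls + (isLoadable(cls) ? " is" : " is NOT") + " on the classpath");
    }
}
```

Run with the same classpath the tests use (e.g. `java -cp <test-classpath> ClasspathProbe`); if it prints "is NOT", the protobuf jar is the missing dependency.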


REGRESSION:  org.apache.hadoop.mapred.TestFileInputFormat.testLocality

Error Message:
com/google/protobuf/MessageOrBuilder

Stack Trace:
java.lang.NoClassDefFoundError: com/google/protobuf/MessageOrBuilder
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
	at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
	at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
	at org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PacketHeader.<clinit>(DataTransferProtocol.java:504)
	at org.apache.hadoop.hdfs.DFSOutputStream.computePacketChunkSize(DFSOutputStream.java:1282)
	at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1237)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:747)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:705)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:255)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:725)
	at org.apache.hadoop.mapred.TestFileInputFormat.createInputs(TestFileInputFormat.java:95)
	at org.apache.hadoop.mapred.TestFileInputFormat.__CLR3_0_2b6vakk1d8l(TestFileInputFormat.java:56)
	at org.apache.hadoop.mapred.TestFileInputFormat.testLocality(TestFileInputFormat.java:48)
Caused by: java.lang.ClassNotFoundException: com.google.protobuf.MessageOrBuilder
	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:248)


REGRESSION:  org.apache.hadoop.mapred.TestFileInputFormat.testNumInputs

Error Message:
Could not initialize class org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PacketHeader

Stack Trace:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PacketHeader
	at org.apache.hadoop.hdfs.DFSOutputStream.computePacketChunkSize(DFSOutputStream.java:1282)
	at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1237)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:747)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:705)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:255)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:725)
	at org.apache.hadoop.mapred.TestFileInputFormat.createInputs(TestFileInputFormat.java:95)
	at org.apache.hadoop.mapred.TestFileInputFormat.__CLR3_0_2xfe8n81d9o(TestFileInputFormat.java:114)
	at org.apache.hadoop.mapred.TestFileInputFormat.testNumInputs(TestFileInputFormat.java:104)


REGRESSION:  org.apache.hadoop.mapred.TestFileInputFormat.testMultiLevelInput

Error Message:
Could not initialize class org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PacketHeader

Stack Trace:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hdfs.protocol.DataTransferProtocol$PacketHeader
	at org.apache.hadoop.hdfs.DFSOutputStream.computePacketChunkSize(DFSOutputStream.java:1282)
	at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1237)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:747)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:705)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:255)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:725)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:706)
	at org.apache.hadoop.mapred.TestFileInputFormat.writeFile(TestFileInputFormat.java:193)
	at org.apache.hadoop.mapred.TestFileInputFormat.__CLR3_0_29v537w1da9(TestFileInputFormat.java:166)
	at org.apache.hadoop.mapred.TestFileInputFormat.testMultiLevelInput(TestFileInputFormat.java:152)
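
The terser "Could not initialize class ... PacketHeader" in the two traces above is a follow-on symptom, not a separate bug: once a class's static initializer fails (here, on the protobuf NoClassDefFoundError), the JVM marks the class erroneous, and every later reference gets a bare NoClassDefFoundError without the original cause. A minimal standalone sketch of that JVM behavior, using a hypothetical `Broken` class as a stand-in for PacketHeader:

```java
// InitFailureDemo.java -- standalone sketch of JVM class-initialization semantics.
// "Broken" is a hypothetical stand-in for DataTransferProtocol$PacketHeader.
public class InitFailureDemo {
    static class Broken {
        static int value = 1;  // non-constant field, so any access triggers class init
        static {
            if (true) throw new RuntimeException("simulated missing dependency");
        }
    }

    /** Touches Broken and returns the simple name of the error thrown, or "ok". */
    static String tryAccess() {
        try {
            return "ok:" + Broken.value;
        } catch (Throwable t) {
            return t.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        System.out.println(tryAccess()); // first use: ExceptionInInitializerError (root cause attached)
        System.out.println(tryAccess()); // later uses: NoClassDefFoundError "Could not initialize class ..."
    }
}
```

This is why only the first two TestFileInputFormat failures show the full protobuf trace: the later tests hit the already-poisoned class.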


REGRESSION:  org.apache.hadoop.mapred.TestMiniMRDFSCaching.testWithDFS

Error Message:
java.net.ConnectException: Call to localhost/127.0.0.1:0 failed on connection exception: java.net.ConnectException: Connection refused

Stack Trace:
java.lang.RuntimeException: java.net.ConnectException: Call to localhost/127.0.0.1:0 failed on connection exception: java.net.ConnectException: Connection refused
	at org.apache.hadoop.mapred.MiniMRCluster.waitUntilIdle(MiniMRCluster.java:336)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:546)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:483)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:475)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:467)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:459)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:449)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:439)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:430)
	at org.apache.hadoop.mapred.TestMiniMRDFSCaching.__CLR3_0_2xybd4w1qx8(TestMiniMRDFSCaching.java:41)
	at org.apache.hadoop.mapred.TestMiniMRDFSCaching.testWithDFS(TestMiniMRDFSCaching.java:33)
Caused by: java.net.ConnectException: Call to localhost/127.0.0.1:0 failed on connection exception: java.net.ConnectException: Connection refused
	at org.apache.hadoop.ipc.Client.wrapException(Client.java:1079)
	at org.apache.hadoop.ipc.Client.call(Client.java:1055)
	at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:250)
	at $Proxy12.getClusterMetrics(Unknown Source)
	at org.apache.hadoop.mapreduce.Cluster.getClusterStatus(Cluster.java:200)
	at org.apache.hadoop.mapred.JobClient.getClusterStatus(JobClient.java:677)
	at org.apache.hadoop.mapred.MiniMRCluster.waitUntilIdle(MiniMRCluster.java:323)
Caused by: java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:375)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:440)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:528)
	at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:209)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1188)
	at org.apache.hadoop.ipc.Client.call(Client.java:1032)


FAILED:  org.apache.hadoop.mapred.TestReduceFetchFromPartialMem$1.unknown

Error Message:
java.net.ConnectException: Call to localhost/127.0.0.1:0 failed on connection exception: java.net.ConnectException: Connection refused

Stack Trace:
java.lang.RuntimeException: java.net.ConnectException: Call to localhost/127.0.0.1:0 failed on connection exception: java.net.ConnectException: Connection refused
	at org.apache.hadoop.mapred.MiniMRCluster.waitUntilIdle(MiniMRCluster.java:336)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:546)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:483)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:475)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:467)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:459)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:449)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:439)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:430)
	at org.apache.hadoop.mapred.TestReduceFetchFromPartialMem$1.setUp(TestReduceFetchFromPartialMem.java:60)
	at junit.extensions.TestSetup$1.protect(TestSetup.java:22)
	at junit.extensions.TestSetup.run(TestSetup.java:27)
Caused by: java.net.ConnectException: Call to localhost/127.0.0.1:0 failed on connection exception: java.net.ConnectException: Connection refused
	at org.apache.hadoop.ipc.Client.wrapException(Client.java:1079)
	at org.apache.hadoop.ipc.Client.call(Client.java:1055)
	at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:250)
	at $Proxy12.getClusterMetrics(Unknown Source)
	at org.apache.hadoop.mapreduce.Cluster.getClusterStatus(Cluster.java:200)
	at org.apache.hadoop.mapred.JobClient.getClusterStatus(JobClient.java:677)
	at org.apache.hadoop.mapred.MiniMRCluster.waitUntilIdle(MiniMRCluster.java:323)
Caused by: java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:375)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:440)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:528)
	at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:209)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1188)
	at org.apache.hadoop.ipc.Client.call(Client.java:1032)


FAILED:  org.apache.hadoop.mapred.TestSeveral$1.unknown

Error Message:
java.net.ConnectException: Call to localhost/127.0.0.1:0 failed on connection exception: java.net.ConnectException: Connection refused

Stack Trace:
java.lang.RuntimeException: java.net.ConnectException: Call to localhost/127.0.0.1:0 failed on connection exception: java.net.ConnectException: Connection refused
	at org.apache.hadoop.mapred.MiniMRCluster.waitUntilIdle(MiniMRCluster.java:336)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:546)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:483)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:475)
	at org.apache.hadoop.mapred.TestSeveral$1.setUp(TestSeveral.java:108)
	at junit.extensions.TestSetup$1.protect(TestSetup.java:22)
	at junit.extensions.TestSetup.run(TestSetup.java:27)
Caused by: java.net.ConnectException: Call to localhost/127.0.0.1:0 failed on connection exception: java.net.ConnectException: Connection refused
	at org.apache.hadoop.ipc.Client.wrapException(Client.java:1079)
	at org.apache.hadoop.ipc.Client.call(Client.java:1055)
	at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:250)
	at $Proxy12.getClusterMetrics(Unknown Source)
	at org.apache.hadoop.mapreduce.Cluster.getClusterStatus(Cluster.java:200)
	at org.apache.hadoop.mapred.JobClient.getClusterStatus(JobClient.java:677)
	at org.apache.hadoop.mapred.MiniMRCluster.waitUntilIdle(MiniMRCluster.java:323)
Caused by: java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:375)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:440)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:528)
	at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:209)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1188)
	at org.apache.hadoop.ipc.Client.call(Client.java:1032)
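
The last three failures share one symptom: MiniMRCluster.waitUntilIdle() dials "localhost/127.0.0.1:0". Port 0 suggests the JobTracker's RPC port was never actually bound (plausibly because cluster startup already broke on the class-loading errors above), so every poll is refused. A hedged standalone sketch of what a refused TCP connection looks like at the socket level; it assumes nothing is listening on 127.0.0.1 port 1, which is a hypothetical choice for illustration:

```java
// RefusedDemo.java -- standalone sketch; assumes no listener on 127.0.0.1:1.
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class RefusedDemo {
    /** Returns true if a TCP connect to host:port fails (typically "Connection refused"). */
    public static boolean isRefused(String host, int port) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), 500); // 500 ms timeout
            return false;  // something answered on that port
        } catch (IOException e) {
            return true;   // e.g. java.net.ConnectException: Connection refused
        }
    }

    public static void main(String[] args) {
        System.out.println("127.0.0.1:1 refused? " + isRefused("127.0.0.1", 1));
    }
}
```

When triaging failures like these, checking whether the tracker port in the log is 0 (never assigned) versus a real port that stopped answering distinguishes "cluster never started" from "cluster died mid-run".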




Hadoop-Mapreduce-trunk-Commit - Build # 724 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Commit/724/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 6226 lines...]
    [javac] symbol  : class DataTransferProtocol
    [javac] location: package org.apache.hadoop.hdfs.protocol
    [javac] import org.apache.hadoop.hdfs.protocol.DataTransferProtocol;
    [javac]                                       ^
    [javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/src/contrib/raid/src/java/org/apache/hadoop/hdfs/server/datanode/RaidBlockSender.java:250: cannot find symbol
    [javac] symbol  : class PacketHeader
    [javac] location: class org.apache.hadoop.hdfs.server.datanode.RaidBlockSender
    [javac]     PacketHeader header = new PacketHeader(
    [javac]     ^
    [javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/src/contrib/raid/src/java/org/apache/hadoop/hdfs/server/datanode/RaidBlockSender.java:250: cannot find symbol
    [javac] symbol  : class PacketHeader
    [javac] location: class org.apache.hadoop.hdfs.server.datanode.RaidBlockSender
    [javac]     PacketHeader header = new PacketHeader(
    [javac]                               ^
    [javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/src/contrib/raid/src/java/org/apache/hadoop/hdfs/server/datanode/RaidBlockSender.java:379: cannot find symbol
    [javac] symbol  : variable PacketHeader
    [javac] location: class org.apache.hadoop.hdfs.server.datanode.RaidBlockSender
    [javac]       int pktSize = PacketHeader.PKT_HEADER_LEN;
    [javac]                     ^
    [javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/src/contrib/raid/src/java/org/apache/hadoop/raid/BlockFixer.java:784: package DataTransferProtocol does not exist
    [javac]                                                  DataTransferProtocol.
    [javac]                                                                      ^
    [javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/src/contrib/raid/src/java/org/apache/hadoop/raid/BlockFixer.java:783: package DataTransferProtocol does not exist
    [javac]         DataTransferProtocol.Sender.opWriteBlock(out, block.getBlock(), 1,
    [javac]                             ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 7 errors

BUILD FAILED
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/build.xml:450: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/src/contrib/build.xml:30: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/src/contrib/build-contrib.xml:194: Compile failed; see the compiler error output for details.

Total time: 14 minutes 14 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Mapreduce-trunk-Commit - Build # 723 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Commit/723/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 6236 lines...]
    [javac] symbol  : class DataTransferProtocol
    [javac] location: package org.apache.hadoop.hdfs.protocol
    [javac] import org.apache.hadoop.hdfs.protocol.DataTransferProtocol;
    [javac]                                       ^
    [javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/src/contrib/raid/src/java/org/apache/hadoop/hdfs/server/datanode/RaidBlockSender.java:250: cannot find symbol
    [javac] symbol  : class PacketHeader
    [javac] location: class org.apache.hadoop.hdfs.server.datanode.RaidBlockSender
    [javac]     PacketHeader header = new PacketHeader(
    [javac]     ^
    [javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/src/contrib/raid/src/java/org/apache/hadoop/hdfs/server/datanode/RaidBlockSender.java:250: cannot find symbol
    [javac] symbol  : class PacketHeader
    [javac] location: class org.apache.hadoop.hdfs.server.datanode.RaidBlockSender
    [javac]     PacketHeader header = new PacketHeader(
    [javac]                               ^
    [javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/src/contrib/raid/src/java/org/apache/hadoop/hdfs/server/datanode/RaidBlockSender.java:379: cannot find symbol
    [javac] symbol  : variable PacketHeader
    [javac] location: class org.apache.hadoop.hdfs.server.datanode.RaidBlockSender
    [javac]       int pktSize = PacketHeader.PKT_HEADER_LEN;
    [javac]                     ^
    [javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/src/contrib/raid/src/java/org/apache/hadoop/raid/BlockFixer.java:784: package DataTransferProtocol does not exist
    [javac]                                                  DataTransferProtocol.
    [javac]                                                                      ^
    [javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/src/contrib/raid/src/java/org/apache/hadoop/raid/BlockFixer.java:783: package DataTransferProtocol does not exist
    [javac]         DataTransferProtocol.Sender.opWriteBlock(out, block.getBlock(), 1,
    [javac]                             ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 7 errors

BUILD FAILED
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/build.xml:450: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/src/contrib/build.xml:30: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk-Commit/trunk/src/contrib/build-contrib.xml:194: Compile failed; see the compiler error output for details.

Total time: 12 minutes 24 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.