Posted to commits@kafka.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2012/08/15 18:16:19 UTC

Build failed in Jenkins: Kafka-0.8 #21

See <https://builds.apache.org/job/Kafka-0.8/21/changes>

Changes:

[junrao] KafkaController.RequestSendThread can throw exception on broker socket; patched by Yang Ye; reviewed by Jun Rao; KAFKA-459, KAFKA-460

------------------------------------------
[...truncated 964 lines...]
[info] Test Starting: testFileSize
[info] Test Passed: testFileSize
[info] Test Starting: testIterationOverPartialAndTruncation
[info] Test Passed: testIterationOverPartialAndTruncation
[info] Test Starting: testIterationDoesntChangePosition
[info] Test Passed: testIterationDoesntChangePosition
[info] Test Starting: testRead
[info] Test Passed: testRead
[info] == core-kafka / kafka.message.FileMessageSetTest ==
[info] 
[info] == core-kafka / kafka.server.LogRecoveryTest ==
[info] Test Starting: testHWCheckpointNoFailuresSingleLogSegment(kafka.server.LogRecoveryTest)
[info] Test Passed: testHWCheckpointNoFailuresSingleLogSegment(kafka.server.LogRecoveryTest)
[info] Test Starting: testHWCheckpointWithFailuresSingleLogSegment(kafka.server.LogRecoveryTest)
[info] Test Passed: testHWCheckpointWithFailuresSingleLogSegment(kafka.server.LogRecoveryTest)
[info] Test Starting: testHWCheckpointNoFailuresMultipleLogSegments(kafka.server.LogRecoveryTest)
[info] Test Passed: testHWCheckpointNoFailuresMultipleLogSegments(kafka.server.LogRecoveryTest)
[info] Test Starting: testHWCheckpointWithFailuresMultipleLogSegments(kafka.server.LogRecoveryTest)
[info] Test Passed: testHWCheckpointWithFailuresMultipleLogSegments(kafka.server.LogRecoveryTest)
[info] == core-kafka / kafka.server.LogRecoveryTest ==
[info] 
[info] == core-kafka / kafka.integration.TopicMetadataTest ==
[info] Test Starting: testTopicMetadataRequest(kafka.integration.TopicMetadataTest)
[info] Test Passed: testTopicMetadataRequest(kafka.integration.TopicMetadataTest)
[info] Test Starting: testBasicTopicMetadata(kafka.integration.TopicMetadataTest)
[info] Test Passed: testBasicTopicMetadata(kafka.integration.TopicMetadataTest)
[info] Test Starting: testAutoCreateTopic(kafka.integration.TopicMetadataTest)
[info] Test Passed: testAutoCreateTopic(kafka.integration.TopicMetadataTest)
[info] == core-kafka / kafka.integration.TopicMetadataTest ==
[info] 
[info] == core-kafka / kafka.integration.LazyInitProducerTest ==
[info] Test Starting: testProduceAndMultiFetch(kafka.integration.LazyInitProducerTest)
[info] Test Passed: testProduceAndMultiFetch(kafka.integration.LazyInitProducerTest)
[info] Test Starting: testMultiProduce(kafka.integration.LazyInitProducerTest)
[info] Test Passed: testMultiProduce(kafka.integration.LazyInitProducerTest)
[info] Test Starting: testProduceAndFetch(kafka.integration.LazyInitProducerTest)
[info] Test Passed: testProduceAndFetch(kafka.integration.LazyInitProducerTest)
[info] Test Starting: testMultiProduceResend(kafka.integration.LazyInitProducerTest)
[info] Test Passed: testMultiProduceResend(kafka.integration.LazyInitProducerTest)
[info] == core-kafka / kafka.integration.LazyInitProducerTest ==
[info] 
[info] == core-kafka / kafka.server.ServerShutdownTest ==
[info] Test Starting: testCleanShutdown(kafka.server.ServerShutdownTest)
[info] Test Passed: testCleanShutdown(kafka.server.ServerShutdownTest)
[info] == core-kafka / kafka.server.ServerShutdownTest ==
[info] 
[info] == core-kafka / kafka.server.ReplicaFetchTest ==
[info] Test Starting: testReplicaFetcherThread(kafka.server.ReplicaFetchTest)
[info] Test Passed: testReplicaFetcherThread(kafka.server.ReplicaFetchTest)
[info] == core-kafka / kafka.server.ReplicaFetchTest ==
[info] 
[info] == core-kafka / kafka.log.LogTest ==
[info] Test Starting: testLoadEmptyLog
[info] Test Passed: testLoadEmptyLog
[info] Test Starting: testLoadInvalidLogsFails
[info] Test Passed: testLoadInvalidLogsFails
[info] Test Starting: testAppendAndRead
[info] Test Passed: testAppendAndRead
[info] Test Starting: testReadOutOfRange
[info] Test Passed: testReadOutOfRange
[info] Test Starting: testLogRolls
[info] Test Passed: testLogRolls
[info] Test Starting: testFindSegment
[info] Test Passed: testFindSegment
[info] Test Starting: testEdgeLogRolls
[info] Test Passed: testEdgeLogRolls
[info] == core-kafka / kafka.log.LogTest ==
[info] 
[info] == core-kafka / kafka.log.SegmentListTest ==
[info] Test Starting: testAppend
[info] Test Passed: testAppend
[info] Test Starting: testTrunc
[info] Test Passed: testTrunc
[info] Test Starting: testTruncLast
[info] Test Passed: testTruncLast
[info] Test Starting: testTruncBeyondList
[info] Test Passed: testTruncBeyondList
[info] == core-kafka / kafka.log.SegmentListTest ==
[info] 
[info] == core-kafka / kafka.utils.UtilsTest ==
[info] Test Starting: testSwallow
[info] Test Passed: testSwallow
[info] Test Starting: testCircularIterator
[info] Test Passed: testCircularIterator
[info] == core-kafka / kafka.utils.UtilsTest ==
[info] 
[info] == core-kafka / kafka.message.CompressionUtilTest ==
[info] Test Starting: testSimpleCompressDecompress
[info] Test Passed: testSimpleCompressDecompress
[info] Test Starting: testComplexCompressDecompress
[info] Test Passed: testComplexCompressDecompress
[info] Test Starting: testSnappyCompressDecompressExplicit
java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:317)
	at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:219)
	at org.xerial.snappy.Snappy.<clinit>(Snappy.java:44)
	at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:79)
	at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:66)
	at kafka.message.SnappyCompression.<init>(CompressionUtils.scala:61)
	at kafka.message.CompressionFactory$.apply(CompressionUtils.scala:82)
	at kafka.message.CompressionUtils$.compress(CompressionUtils.scala:109)
	at kafka.message.CompressionUtilTest.testSnappyCompressDecompressExplicit(CompressionUtilsTest.scala:65)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.junit.internal.runners.TestMethodRunner.executeMethodBody(TestMethodRunner.java:99)
	at org.junit.internal.runners.TestMethodRunner.runUnprotected(TestMethodRunner.java:81)
	at org.junit.internal.runners.BeforeAndAfterRunner.runProtected(BeforeAndAfterRunner.java:34)
	at org.junit.internal.runners.TestMethodRunner.runMethod(TestMethodRunner.java:75)
	at org.junit.internal.runners.TestMethodRunner.run(TestMethodRunner.java:45)
	at org.junit.internal.runners.TestClassMethodsRunner.invokeTestMethod(TestClassMethodsRunner.java:71)
	at org.junit.internal.runners.TestClassMethodsRunner.run(TestClassMethodsRunner.java:35)
	at org.junit.internal.runners.TestClassRunner$1.runUnprotected(TestClassRunner.java:42)
	at org.junit.internal.runners.BeforeAndAfterRunner.runProtected(BeforeAndAfterRunner.java:34)
	at org.junit.internal.runners.TestClassRunner.run(TestClassRunner.java:52)
	at org.junit.internal.runners.CompositeRunner.run(CompositeRunner.java:29)
	at org.junit.runner.JUnitCore.run(JUnitCore.java:121)
	at org.junit.runner.JUnitCore.run(JUnitCore.java:100)
	at org.junit.runner.JUnitCore.run(JUnitCore.java:91)
	at org.scalatest.junit.JUnitSuite$class.run(JUnitSuite.scala:261)
	at kafka.message.CompressionUtilTest.run(CompressionUtilsTest.scala:25)
	at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
	at sbt.TestRunner.run(TestFramework.scala:53)
	at sbt.TestRunner.runTest$1(TestFramework.scala:67)
	at sbt.TestRunner.run(TestFramework.scala:76)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.NamedTestTask.run(TestFramework.scala:92)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
	at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
	at sbt.impl.RunTask.runTask(RunTask.scala:85)
	at sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Control$.trapUnit(Control.scala:19)
	at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
Caused by: java.lang.UnsatisfiedLinkError: no snappyjava in java.library.path
	at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1734)
	at java.lang.Runtime.loadLibrary0(Runtime.java:823)
	at java.lang.System.loadLibrary(System.java:1028)
	at org.xerial.snappy.SnappyNativeLoader.loadLibrary(SnappyNativeLoader.java:52)
	... 53 more
[error] Test Failed: testSnappyCompressDecompressExplicit
org.xerial.snappy.SnappyError: [FAILED_TO_LOAD_NATIVE_LIBRARY] null
	at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:229)
	at org.xerial.snappy.Snappy.<clinit>(Snappy.java:44)
	at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:79)
	at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:66)
	at kafka.message.SnappyCompression.<init>(CompressionUtils.scala:61)
	at kafka.message.CompressionFactory$.apply(CompressionUtils.scala:82)
	at kafka.message.CompressionUtils$.compress(CompressionUtils.scala:109)
	at kafka.message.CompressionUtilTest.testSnappyCompressDecompressExplicit(CompressionUtilsTest.scala:65)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.junit.internal.runners.TestMethodRunner.executeMethodBody(TestMethodRunner.java:99)
	at org.junit.internal.runners.TestMethodRunner.runUnprotected(TestMethodRunner.java:81)
	at org.junit.internal.runners.BeforeAndAfterRunner.runProtected(BeforeAndAfterRunner.java:34)
	at org.junit.internal.runners.TestMethodRunner.runMethod(TestMethodRunner.java:75)
	at org.junit.internal.runners.TestMethodRunner.run(TestMethodRunner.java:45)
	at org.junit.internal.runners.TestClassMethodsRunner.invokeTestMethod(TestClassMethodsRunner.java:71)
	at org.junit.internal.runners.TestClassMethodsRunner.run(TestClassMethodsRunner.java:35)
	at org.junit.internal.runners.TestClassRunner$1.runUnprotected(TestClassRunner.java:42)
	at org.junit.internal.runners.BeforeAndAfterRunner.runProtected(BeforeAndAfterRunner.java:34)
	at org.junit.internal.runners.TestClassRunner.run(TestClassRunner.java:52)
	at org.junit.internal.runners.CompositeRunner.run(CompositeRunner.java:29)
	at org.junit.runner.JUnitCore.run(JUnitCore.java:121)
	at org.junit.runner.JUnitCore.run(JUnitCore.java:100)
	at org.junit.runner.JUnitCore.run(JUnitCore.java:91)
	at org.scalatest.junit.JUnitSuite$class.run(JUnitSuite.scala:261)
	at kafka.message.CompressionUtilTest.run(CompressionUtilsTest.scala:25)
	at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
	at sbt.TestRunner.run(TestFramework.scala:53)
	at sbt.TestRunner.runTest$1(TestFramework.scala:67)
	at sbt.TestRunner.run(TestFramework.scala:76)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.NamedTestTask.run(TestFramework.scala:92)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
	at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
	at sbt.impl.RunTask.runTask(RunTask.scala:85)
	at sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Control$.trapUnit(Control.scala:19)
	at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
[info] == core-kafka / kafka.message.CompressionUtilTest ==
[info] 
[info] == core-kafka / kafka.message.MessageTest ==
[info] Test Starting: testFieldValues
[info] Test Passed: testFieldValues
[info] Test Starting: testChecksum
[info] Test Passed: testChecksum
[info] Test Starting: testEquality
[info] Test Passed: testEquality
[info] Test Starting: testIsHashable
[info] Test Passed: testIsHashable
[info] == core-kafka / kafka.message.MessageTest ==
[info] 
[info] == core-kafka / kafka.server.HighwatermarkPersistenceTest ==
[info] Test Starting: testHighWatermarkPersistenceSinglePartition(kafka.server.HighwatermarkPersistenceTest)
[info] Test Passed: testHighWatermarkPersistenceSinglePartition(kafka.server.HighwatermarkPersistenceTest)
[info] Test Starting: testHighWatermarkPersistenceMultiplePartitions(kafka.server.HighwatermarkPersistenceTest)
[info] Test Passed: testHighWatermarkPersistenceMultiplePartitions(kafka.server.HighwatermarkPersistenceTest)
[info] == core-kafka / kafka.server.HighwatermarkPersistenceTest ==
[info] 
[info] == core-kafka / test-finish ==
[error] Failed: : Total 137, Failed 3, Errors 0, Passed 134, Skipped 0
[info] == core-kafka / test-finish ==
[info] 
[info] == core-kafka / Test cleanup 1 ==
[info] Deleting directory /var/tmp/sbt_510e5050
[info] == core-kafka / Test cleanup 1 ==
[info] 
[info] == core-kafka / test-cleanup ==
[info] == core-kafka / test-cleanup ==
[info] 
[info] == perf / test-compile ==
[info]   Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.
[info] Compiling test sources...
[info] Nothing to compile.
[info]   Post-analysis: 0 classes.
[info] == perf / test-compile ==
[error] Error running kafka.log.LogCorruptionTest: Test FAILED
[error] Error running kafka.log.LogOffsetTest: Test FAILED
[error] Error running kafka.message.CompressionUtilTest: Test FAILED
[error] Error running test: One or more subtasks failed
[info] 
[info] Total time: 441 s, completed Aug 15, 2012 4:16:29 PM
[info] 
[info] Total session time: 443 s, completed Aug 15, 2012 4:16:29 PM
[error] Error during build.
Build step 'Execute shell' marked build as failure

Jenkins build is back to normal : Kafka-0.8 #26

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Kafka-0.8/26/changes>


Build failed in Jenkins: Kafka-0.8 #25

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Kafka-0.8/25/changes>

Changes:

[junrao] failed ERROR messages in LazyInitProducerTest; patched by Yang Ye; reviewed by Jun Rao; kafka-467

------------------------------------------
[...truncated 1026 lines...]
[info] Test Starting: testProduceAfterClosed(kafka.producer.AsyncProducerTest)
[info] Test Passed: testProduceAfterClosed(kafka.producer.AsyncProducerTest)
[info] Test Starting: testBatchSize(kafka.producer.AsyncProducerTest)
[info] Test Passed: testBatchSize(kafka.producer.AsyncProducerTest)
[info] Test Starting: testQueueTimeExpired(kafka.producer.AsyncProducerTest)
[info] Test Passed: testQueueTimeExpired(kafka.producer.AsyncProducerTest)
[info] Test Starting: testPartitionAndCollateEvents(kafka.producer.AsyncProducerTest)
[info] Test Passed: testPartitionAndCollateEvents(kafka.producer.AsyncProducerTest)
[info] Test Starting: testSerializeEvents(kafka.producer.AsyncProducerTest)
[info] Test Passed: testSerializeEvents(kafka.producer.AsyncProducerTest)
[info] Test Starting: testInvalidPartition(kafka.producer.AsyncProducerTest)
[info] Test Passed: testInvalidPartition(kafka.producer.AsyncProducerTest)
[info] Test Starting: testNoBroker(kafka.producer.AsyncProducerTest)
[info] Test Passed: testNoBroker(kafka.producer.AsyncProducerTest)
[info] Test Starting: testIncompatibleEncoder(kafka.producer.AsyncProducerTest)
[info] Test Passed: testIncompatibleEncoder(kafka.producer.AsyncProducerTest)
[info] Test Starting: testRandomPartitioner(kafka.producer.AsyncProducerTest)
[info] Test Passed: testRandomPartitioner(kafka.producer.AsyncProducerTest)
[info] Test Starting: testBrokerListAndAsync(kafka.producer.AsyncProducerTest)
[info] Test Passed: testBrokerListAndAsync(kafka.producer.AsyncProducerTest)
[info] Test Starting: testFailedSendRetryLogic(kafka.producer.AsyncProducerTest)
[info] Test Passed: testFailedSendRetryLogic(kafka.producer.AsyncProducerTest)
[info] Test Starting: testJavaProducer(kafka.producer.AsyncProducerTest)
[info] Test Passed: testJavaProducer(kafka.producer.AsyncProducerTest)
[info] Test Starting: testInvalidConfiguration(kafka.producer.AsyncProducerTest)
[info] Test Passed: testInvalidConfiguration(kafka.producer.AsyncProducerTest)
[info] == core-kafka / kafka.producer.AsyncProducerTest ==
[info] 
[info] == core-kafka / kafka.integration.BackwardsCompatibilityTest ==
[info] Test Starting: testProtocolVersion0(kafka.integration.BackwardsCompatibilityTest)
[info] Test Passed: testProtocolVersion0(kafka.integration.BackwardsCompatibilityTest)
[info] == core-kafka / kafka.integration.BackwardsCompatibilityTest ==
[info] 
[info] == core-kafka / kafka.integration.TopicMetadataTest ==
[info] Test Starting: testTopicMetadataRequest(kafka.integration.TopicMetadataTest)
[info] Test Passed: testTopicMetadataRequest(kafka.integration.TopicMetadataTest)
[info] Test Starting: testBasicTopicMetadata(kafka.integration.TopicMetadataTest)
[info] Test Passed: testBasicTopicMetadata(kafka.integration.TopicMetadataTest)
[info] Test Starting: testAutoCreateTopic(kafka.integration.TopicMetadataTest)
[info] Test Passed: testAutoCreateTopic(kafka.integration.TopicMetadataTest)
[info] == core-kafka / kafka.integration.TopicMetadataTest ==
[info] 
[info] == core-kafka / kafka.zk.ZKEphemeralTest ==
[info] Test Starting: testEphemeralNodeCleanup(kafka.zk.ZKEphemeralTest)
[info] Test Passed: testEphemeralNodeCleanup(kafka.zk.ZKEphemeralTest)
[info] == core-kafka / kafka.zk.ZKEphemeralTest ==
[info] 
[info] == core-kafka / kafka.log.LogTest ==
[info] Test Starting: testLoadEmptyLog
[info] Test Passed: testLoadEmptyLog
[info] Test Starting: testLoadInvalidLogsFails
[info] Test Passed: testLoadInvalidLogsFails
[info] Test Starting: testAppendAndRead
[info] Test Passed: testAppendAndRead
[info] Test Starting: testReadOutOfRange
[info] Test Passed: testReadOutOfRange
[info] Test Starting: testLogRolls
[info] Test Passed: testLogRolls
[info] Test Starting: testFindSegment
[info] Test Passed: testFindSegment
[info] Test Starting: testEdgeLogRolls
[info] Test Passed: testEdgeLogRolls
[info] == core-kafka / kafka.log.LogTest ==
[info] 
[info] == core-kafka / kafka.javaapi.consumer.ZookeeperConsumerConnectorTest ==
[info] Test Starting: testBasic(kafka.javaapi.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testBasic(kafka.javaapi.consumer.ZookeeperConsumerConnectorTest)
[info] == core-kafka / kafka.javaapi.consumer.ZookeeperConsumerConnectorTest ==
[info] 
[info] == core-kafka / kafka.message.ByteBufferMessageSetTest ==
[info] Test Starting: testWrittenEqualsRead
[info] Test Passed: testWrittenEqualsRead
[info] Test Starting: testIteratorIsConsistent
[info] Test Passed: testIteratorIsConsistent
[info] Test Starting: testSizeInBytes
[info] Test Passed: testSizeInBytes
[info] Test Starting: testWriteTo
[info] Test Passed: testWriteTo
[info] Test Starting: testSmallFetchSize
[info] Test Passed: testSmallFetchSize
[info] Test Starting: testValidBytes
[info] Test Passed: testValidBytes
[info] Test Starting: testEquals
[info] Test Passed: testEquals
[info] Test Starting: testIterator
[info] Test Passed: testIterator
[info] == core-kafka / kafka.message.ByteBufferMessageSetTest ==
[info] 
[info] == core-kafka / kafka.server.LeaderElectionTest ==
[info] Test Starting: testLeaderElectionAndEpoch(kafka.server.LeaderElectionTest)
[info] Test Passed: testLeaderElectionAndEpoch(kafka.server.LeaderElectionTest)
[info] == core-kafka / kafka.server.LeaderElectionTest ==
[info] 
[info] == core-kafka / unit.kafka.metrics.KafkaTimerTest ==
[info] Test Starting: testKafkaTimer(unit.kafka.metrics.KafkaTimerTest)
[info] Test Passed: testKafkaTimer(unit.kafka.metrics.KafkaTimerTest)
[info] == core-kafka / unit.kafka.metrics.KafkaTimerTest ==
[info] 
[info] == core-kafka / kafka.integration.AutoOffsetResetTest ==
[info] Test Starting: testResetToEarliestWhenOffsetTooHigh(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testResetToEarliestWhenOffsetTooHigh(kafka.integration.AutoOffsetResetTest)
[info] Test Starting: testResetToEarliestWhenOffsetTooLow(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testResetToEarliestWhenOffsetTooLow(kafka.integration.AutoOffsetResetTest)
[info] Test Starting: testResetToLatestWhenOffsetTooHigh(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testResetToLatestWhenOffsetTooHigh(kafka.integration.AutoOffsetResetTest)
[info] Test Starting: testResetToLatestWhenOffsetTooLow(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testResetToLatestWhenOffsetTooLow(kafka.integration.AutoOffsetResetTest)
[info] == core-kafka / kafka.integration.AutoOffsetResetTest ==
[info] 
[info] == core-kafka / kafka.network.SocketServerTest ==
[info] Test Starting: simpleRequest
[info] Test Passed: simpleRequest
[info] Test Starting: tooBigRequestIsRejected
[info] Test Passed: tooBigRequestIsRejected
[info] == core-kafka / kafka.network.SocketServerTest ==
[info] 
[info] == core-kafka / kafka.consumer.ZookeeperConsumerConnectorTest ==
[info] Test Starting: testBasic(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testBasic(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Starting: testCompression(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testCompression(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Starting: testCompressionSetConsumption(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testCompressionSetConsumption(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Starting: testConsumerDecoder(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testConsumerDecoder(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Starting: testLeaderSelectionForPartition(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testLeaderSelectionForPartition(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] == core-kafka / kafka.consumer.ZookeeperConsumerConnectorTest ==
[info] 
[info] == core-kafka / kafka.javaapi.message.ByteBufferMessageSetTest ==
[info] Test Starting: testWrittenEqualsRead
[info] Test Passed: testWrittenEqualsRead
[info] Test Starting: testIteratorIsConsistent
[info] Test Passed: testIteratorIsConsistent
[info] Test Starting: testSizeInBytes
[info] Test Passed: testSizeInBytes
[info] Test Starting: testValidBytes
[info] Test Passed: testValidBytes
[info] Test Starting: testEquals
[info] Test Passed: testEquals
[info] Test Starting: testIteratorIsConsistentWithCompression
[info] Test Passed: testIteratorIsConsistentWithCompression
[info] Test Starting: testSizeInBytesWithCompression
[info] Test Passed: testSizeInBytesWithCompression
[info] Test Starting: testValidBytesWithCompression
[info] Test Passed: testValidBytesWithCompression
[info] Test Starting: testEqualsWithCompression
[info] Test Passed: testEqualsWithCompression
[info] == core-kafka / kafka.javaapi.message.ByteBufferMessageSetTest ==
[info] 
[info] == core-kafka / kafka.message.MessageTest ==
[info] Test Starting: testFieldValues
[info] Test Passed: testFieldValues
[info] Test Starting: testChecksum
[info] Test Passed: testChecksum
[info] Test Starting: testEquality
[info] Test Passed: testEquality
[info] Test Starting: testIsHashable
[info] Test Passed: testIsHashable
[info] == core-kafka / kafka.message.MessageTest ==
[info] 
[info] == core-kafka / kafka.producer.ProducerTest ==
[info] Test Starting: testUpdateBrokerPartitionInfo(kafka.producer.ProducerTest)
[info] Test Passed: testUpdateBrokerPartitionInfo(kafka.producer.ProducerTest)
[info] Test Starting: testSendToNewTopic(kafka.producer.ProducerTest)
[error] Test Failed: testSendToNewTopic(kafka.producer.ProducerTest)
java.lang.AssertionError: Message set should not have any more messages
	at org.junit.Assert.fail(Assert.java:69)
	at org.junit.Assert.assertTrue(Assert.java:32)
	at org.junit.Assert.assertFalse(Assert.java:51)
	at kafka.producer.ProducerTest.testSendToNewTopic(ProducerTest.scala:183)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:164)
	at junit.framework.TestCase.runBare(TestCase.java:130)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:120)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
	at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
	at sbt.TestRunner.run(TestFramework.scala:53)
	at sbt.TestRunner.runTest$1(TestFramework.scala:67)
	at sbt.TestRunner.run(TestFramework.scala:76)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.NamedTestTask.run(TestFramework.scala:92)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
	at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
	at sbt.impl.RunTask.runTask(RunTask.scala:85)
	at sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Control$.trapUnit(Control.scala:19)
	at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
[info] Test Starting: testSendWithDeadBroker(kafka.producer.ProducerTest)
[info] Test Passed: testSendWithDeadBroker(kafka.producer.ProducerTest)
[info] Test Starting: testAsyncSendCanCorrectlyFailWithTimeout(kafka.producer.ProducerTest)
[info] Test Passed: testAsyncSendCanCorrectlyFailWithTimeout(kafka.producer.ProducerTest)
[info] == core-kafka / kafka.producer.ProducerTest ==
[info] 
[info] == core-kafka / kafka.message.FileMessageSetTest ==
[info] Test Starting: testWrittenEqualsRead
[info] Test Passed: testWrittenEqualsRead
[info] Test Starting: testIteratorIsConsistent
[info] Test Passed: testIteratorIsConsistent
[info] Test Starting: testSizeInBytes
[info] Test Passed: testSizeInBytes
[info] Test Starting: testWriteTo
[info] Test Passed: testWriteTo
[info] Test Starting: testFileSize
[info] Test Passed: testFileSize
[info] Test Starting: testIterationOverPartialAndTruncation
[info] Test Passed: testIterationOverPartialAndTruncation
[info] Test Starting: testIterationDoesntChangePosition
[info] Test Passed: testIterationDoesntChangePosition
[info] Test Starting: testRead
[info] Test Passed: testRead
[info] == core-kafka / kafka.message.FileMessageSetTest ==
[info] 
[info] == core-kafka / test-finish ==
[error] Failed: : Total 140, Failed 1, Errors 0, Passed 139, Skipped 0
[info] == core-kafka / test-finish ==
[info] 
[info] == core-kafka / Test cleanup 1 ==
[info] Deleting directory /tmp/sbt_284e8f86
[info] == core-kafka / Test cleanup 1 ==
[info] 
[info] == core-kafka / test-cleanup ==
[info] == core-kafka / test-cleanup ==
[error] Error running kafka.producer.ProducerTest: Test FAILED
[error] Error running test: One or more subtasks failed
[info] 
[info] Total time: 202 s, completed Aug 18, 2012 6:05:19 AM
[info] 
[info] Total session time: 203 s, completed Aug 18, 2012 6:05:19 AM
[error] Error during build.
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: Kafka-0.8 #24

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Kafka-0.8/24/changes>

Changes:

[junrao] KafkaController NPE in SessionExpireListener; patched by Yang Ye; reviewed by Jun Rao, Neha Narkhede; KAFKA-464

------------------------------------------
[...truncated 5638 lines...]
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
	at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
	at sbt.TestRunner.run(TestFramework.scala:53)
	at sbt.TestRunner.runTest$1(TestFramework.scala:67)
	at sbt.TestRunner.run(TestFramework.scala:76)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.NamedTestTask.run(TestFramework.scala:92)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
	at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
	at sbt.impl.RunTask.runTask(RunTask.scala:85)
	at sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Control$.trapUnit(Control.scala:19)
	at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
[2012-08-18 00:41:51,849] WARN Session 0x139372cb9020003 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.Net.connect(Native Method)
	at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:507)
	at org.apache.zookeeper.ClientCnxn$SendThread.startConnect(ClientCnxn.java:1071)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1104)
[2012-08-18 00:41:51,849] WARN Session 0x139372d09690002 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.Net.connect(Native Method)
	at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:507)
	at org.apache.zookeeper.ClientCnxn$SendThread.startConnect(ClientCnxn.java:1071)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1104)
[2012-08-18 00:41:51,848] WARN Session 0x139372b1d270002 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.Net.connect(Native Method)
	at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:507)
	at org.apache.zookeeper.ClientCnxn$SendThread.startConnect(ClientCnxn.java:1071)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1104)
[2012-08-18 00:41:51,848] WARN Session 0x139372d123f0002 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.Net.connect(Native Method)
	at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:507)
	at org.apache.zookeeper.ClientCnxn$SendThread.startConnect(ClientCnxn.java:1071)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1104)
[2012-08-18 00:41:51,848] WARN Session 0x139372d08910002 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.Net.connect(Native Method)
	at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:507)
	at org.apache.zookeeper.ClientCnxn$SendThread.startConnect(ClientCnxn.java:1071)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1104)
[2012-08-18 00:41:51,868] ERROR Connection attempt to localhost:47213 failed, next attempt in 100 ms (kafka.producer.SyncProducer:99)
java.net.ConnectException: Connection refused
	at sun.nio.ch.Net.connect(Native Method)
	at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:507)
	at kafka.network.BlockingChannel.connect(BlockingChannel.scala:57)
	at kafka.producer.SyncProducer.connect(SyncProducer.scala:161)
	at kafka.producer.SyncProducer.getOrMakeConnection(SyncProducer.scala:182)
	at kafka.producer.SyncProducer.doSend(SyncProducer.scala:74)
	at kafka.producer.SyncProducer.send(SyncProducer.scala:116)
	at kafka.producer.BrokerPartitionInfo.updateInfo(BrokerPartitionInfo.scala:86)
	at kafka.producer.async.DefaultEventHandler$$anonfun$handle$1.apply$mcV$sp(DefaultEventHandler.scala:53)
	at kafka.utils.Utils$.swallow(Utils.scala:415)
	at kafka.utils.Logging$class.swallowError(Logging.scala:102)
	at kafka.utils.Utils$.swallowError(Utils.scala:40)
	at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:53)
	at kafka.producer.AsyncProducerTest.testFailedSendRetryLogic(AsyncProducerTest.scala:438)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:164)
	at junit.framework.TestCase.runBare(TestCase.java:130)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:120)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
	at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
	at sbt.TestRunner.run(TestFramework.scala:53)
	at sbt.TestRunner.runTest$1(TestFramework.scala:67)
	at sbt.TestRunner.run(TestFramework.scala:76)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.NamedTestTask.run(TestFramework.scala:92)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
	at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
	at sbt.impl.RunTask.runTask(RunTask.scala:85)
	at sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Control$.trapUnit(Control.scala:19)
	at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
[2012-08-18 00:41:51,978] ERROR Producer connection to localhost:47213 timing out after 5000 ms (kafka.producer.SyncProducer:99)
java.net.ConnectException: Connection refused
	at sun.nio.ch.Net.connect(Native Method)
	at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:507)
	at kafka.network.BlockingChannel.connect(BlockingChannel.scala:57)
	at kafka.producer.SyncProducer.connect(SyncProducer.scala:161)
	at kafka.producer.SyncProducer.getOrMakeConnection(SyncProducer.scala:182)
	at kafka.producer.SyncProducer.doSend(SyncProducer.scala:74)
	at kafka.producer.SyncProducer.send(SyncProducer.scala:116)
	at kafka.producer.BrokerPartitionInfo.updateInfo(BrokerPartitionInfo.scala:86)
	at kafka.producer.async.DefaultEventHandler$$anonfun$handle$1.apply$mcV$sp(DefaultEventHandler.scala:53)
	at kafka.utils.Utils$.swallow(Utils.scala:415)
	at kafka.utils.Logging$class.swallowError(Logging.scala:102)
	at kafka.utils.Utils$.swallowError(Utils.scala:40)
	at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:53)
	at kafka.producer.AsyncProducerTest.testFailedSendRetryLogic(AsyncProducerTest.scala:438)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:164)
	at junit.framework.TestCase.runBare(TestCase.java:130)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:120)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
	at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
	at sbt.TestRunner.run(TestFramework.scala:53)
	at sbt.TestRunner.runTest$1(TestFramework.scala:67)
	at sbt.TestRunner.run(TestFramework.scala:76)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.NamedTestTask.run(TestFramework.scala:92)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
	at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
	at sbt.impl.RunTask.runTask(RunTask.scala:85)
	at sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Control$.trapUnit(Control.scala:19)
	at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
[2012-08-18 00:41:51,979] ERROR fetching broker partition metadata for topics [ListBuffer(topic1)] from broker [ArrayBuffer(id:0,creatorId:localhost-1345250491862,host:localhost,port:47213)] failed (kafka.utils.Utils$:102)
kafka.common.KafkaException: fetching broker partition metadata for topics [ListBuffer(topic1)] from broker [ArrayBuffer(id:0,creatorId:localhost-1345250491862,host:localhost,port:47213)] failed
	at kafka.producer.BrokerPartitionInfo.updateInfo(BrokerPartitionInfo.scala:112)
	at kafka.producer.async.DefaultEventHandler$$anonfun$handle$1.apply$mcV$sp(DefaultEventHandler.scala:53)
	at kafka.utils.Utils$.swallow(Utils.scala:415)
	at kafka.utils.Logging$class.swallowError(Logging.scala:102)
	at kafka.utils.Utils$.swallowError(Utils.scala:40)
	at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:53)
	at kafka.producer.AsyncProducerTest.testFailedSendRetryLogic(AsyncProducerTest.scala:438)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:164)
	at junit.framework.TestCase.runBare(TestCase.java:130)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:120)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
	at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
	at sbt.TestRunner.run(TestFramework.scala:53)
	at sbt.TestRunner.runTest$1(TestFramework.scala:67)
	at sbt.TestRunner.run(TestFramework.scala:76)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.NamedTestTask.run(TestFramework.scala:92)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
	at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
	at sbt.impl.RunTask.runTask(RunTask.scala:85)
	at sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Control$.trapUnit(Control.scala:19)
	at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
Caused by: java.net.ConnectException: Connection refused
	at sun.nio.ch.Net.connect(Native Method)
	at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:507)
	at kafka.network.BlockingChannel.connect(BlockingChannel.scala:57)
	at kafka.producer.SyncProducer.connect(SyncProducer.scala:161)
	at kafka.producer.SyncProducer.getOrMakeConnection(SyncProducer.scala:182)
	at kafka.producer.SyncProducer.doSend(SyncProducer.scala:74)
	at kafka.producer.SyncProducer.send(SyncProducer.scala:116)
	at kafka.producer.BrokerPartitionInfo.updateInfo(BrokerPartitionInfo.scala:86)
	... 41 more
[info] Test Passed: testFailedSendRetryLogic(kafka.producer.AsyncProducerTest)
[info] Test Starting: testJavaProducer(kafka.producer.AsyncProducerTest)
[info] Test Passed: testJavaProducer(kafka.producer.AsyncProducerTest)
[info] Test Starting: testInvalidConfiguration(kafka.producer.AsyncProducerTest)
[info] Test Passed: testInvalidConfiguration(kafka.producer.AsyncProducerTest)
[info] == core-kafka / kafka.producer.AsyncProducerTest ==
[info] 
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] Test Starting: testKafkaLog4jConfigs(kafka.log4j.KafkaLog4jAppenderTest)
log4j:WARN No appenders could be found for logger (org.I0Itec.zkclient.ZkEventThread).
log4j:WARN Please initialize the log4j system properly.
[info] Test Passed: testKafkaLog4jConfigs(kafka.log4j.KafkaLog4jAppenderTest)
[info] Test Starting: testLog4jAppends(kafka.log4j.KafkaLog4jAppenderTest)
[info] Test Passed: testLog4jAppends(kafka.log4j.KafkaLog4jAppenderTest)
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] 
[info] == core-kafka / kafka.server.LeaderElectionTest ==
[info] Test Starting: testLeaderElectionAndEpoch(kafka.server.LeaderElectionTest)
[info] Test Passed: testLeaderElectionAndEpoch(kafka.server.LeaderElectionTest)
[info] == core-kafka / kafka.server.LeaderElectionTest ==
[info] 
[info] == core-kafka / Test cleanup 1 ==
[info] Deleting directory /var/tmp/sbt_f0ab0172
[info] == core-kafka / Test cleanup 1 ==
[info] 
[info] == core-kafka / test-finish ==
[error] Failed: : Total 140, Failed 2, Errors 0, Passed 138, Skipped 0
[info] == core-kafka / test-finish ==
[info] 
[info] == core-kafka / test-cleanup ==
[info] == core-kafka / test-cleanup ==
[error] Error running kafka.message.CompressionUtilTest: Test FAILED
[error] Error running kafka.producer.AsyncProducerTest: Test FAILED
[error] Error running test: One or more subtasks failed
[info] 
[info] Total time: 289 s, completed Aug 18, 2012 12:41:57 AM
[info] 
[info] Total session time: 290 s, completed Aug 18, 2012 12:41:57 AM
[error] Error during build.
Build step 'Execute shell' marked build as failure

Re: Build failed in Jenkins: Kafka-0.8 #23

Posted by Jay Kreps <ja...@gmail.com>.
It looks like these problems come from the tests not being safe to run
concurrently, because they use hard-coded ports. Hard-coded ports are a problem
because our tests may run at the same time for different branches, or may run
on the same test server as other tests that are using that port.

There is no reason to use a hard-coded port, as we have utility functions
that will find one or more free ports for you:
  TestUtils.choosePort    // to get one free port
or
  TestUtils.choosePorts(5)   // to get 5 free ports

-Jay
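
For illustration, a minimal sketch (in Scala, not from the original thread) of
how a JUnit3-style test could pick its ports at setup time instead of
hard-coding them, assuming kafka.utils.TestUtils provides the
choosePort/choosePorts helpers named above; the class and variable names are
hypothetical:

  import junit.framework.TestCase
  import kafka.utils.TestUtils

  class PortAllocationExampleTest extends TestCase {
    var zkPort: Int = -1            // port for an embedded ZooKeeper
    var brokerPorts: Seq[Int] = Nil // ports for a multi-broker cluster

    override def setUp() {
      super.setUp()
      // Ask TestUtils for free ports rather than hard-coding them, so parallel
      // builds on the same Jenkins machine do not collide on a port.
      zkPort = TestUtils.choosePort
      brokerPorts = TestUtils.choosePorts(3)
      // ... start ZooKeeper on zkPort and brokers on brokerPorts ...
    }

    def testPortsWereAllocated() {
      // Trivial assertion just to keep the sketch self-contained.
      assert(zkPort > 0 && brokerPorts.size == 3)
    }
  }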

On Fri, Aug 17, 2012 at 3:08 PM, Apache Jenkins Server <
jenkins@builds.apache.org> wrote:

> See <https://builds.apache.org/job/Kafka-0.8/23/changes>
>
> Changes:
>
> [junrao] enforce broker.id to be a non-negative integer; patched by
> Swapnil Ghike; reviewed by Jun Rao, Neha Narkhede; KAFKA-424
>
> ------------------------------------------
> [...truncated 3791 lines...]
>         at junit.framework.TestSuite.run(TestSuite.java:223)
>         at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
>         at
> org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
>         at sbt.TestRunner.run(TestFramework.scala:53)
>         at sbt.TestRunner.runTest$1(TestFramework.scala:67)
>         at sbt.TestRunner.run(TestFramework.scala:76)
>         at
> sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
>         at
> sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
>         at
> sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
>         at sbt.NamedTestTask.run(TestFramework.scala:92)
>         at
> sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
>         at
> sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
>         at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
>         at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
>         at sbt.impl.RunTask.runTask(RunTask.scala:85)
>         at
> sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
>         at
> sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
>         at
> sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
>         at
> sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
>         at
> sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
>         at sbt.Control$.trapUnit(Control.scala:19)
>         at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
> [info] Test Starting: testMultiProduce(kafka.integration.LazyInitProducerTest)
> [error] Test Failed: testMultiProduce(kafka.integration.LazyInitProducerTest)
> java.net.BindException: Address already in use
>         at sun.nio.ch.Net.bind(Native Method)
>         at
> sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:126)
>         at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:59)
>         at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:52)
>         at
> org.apache.zookeeper.server.NIOServerCnxn$Factory.<init>(NIOServerCnxn.java:144)
>         at
> org.apache.zookeeper.server.NIOServerCnxn$Factory.<init>(NIOServerCnxn.java:125)
>         at kafka.zk.EmbeddedZookeeper.<init>(EmbeddedZookeeper.scala:32)
>         at
> kafka.zk.ZooKeeperTestHarness$class.setUp(ZooKeeperTestHarness.scala:32)
>         at
> kafka.integration.LazyInitProducerTest.kafka$integration$KafkaServerTestHarness$$super$setUp(LazyInitProducerTest.scala:33)
>         at
> kafka.integration.KafkaServerTestHarness$class.setUp(KafkaServerTestHarness.scala:35)
>         at
> kafka.integration.LazyInitProducerTest.kafka$integration$ProducerConsumerTestHarness$$super$setUp(LazyInitProducerTest.scala:33)
>         at
> kafka.integration.ProducerConsumerTestHarness$class.setUp(ProducerConsumerTestHarness.scala:34)
>         at
> kafka.integration.LazyInitProducerTest.setUp(LazyInitProducerTest.scala:42)
>         at junit.framework.TestCase.runBare(TestCase.java:128)
>         at junit.framework.TestResult$1.protect(TestResult.java:110)
>         at junit.framework.TestResult.runProtected(TestResult.java:128)
>         at junit.framework.TestResult.run(TestResult.java:113)
>         at junit.framework.TestCase.run(TestCase.java:120)
>         at junit.framework.TestSuite.runTest(TestSuite.java:228)
>         at junit.framework.TestSuite.run(TestSuite.java:223)
>         at junit.framework.TestSuite.runTest(TestSuite.java:228)
>         at junit.framework.TestSuite.run(TestSuite.java:223)
>         at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
>         at
> org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
>         at sbt.TestRunner.run(TestFramework.scala:53)
>         at sbt.TestRunner.runTest$1(TestFramework.scala:67)
>         at sbt.TestRunner.run(TestFramework.scala:76)
>         at
> sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
>         at
> sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
>         at
> sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
>         at sbt.NamedTestTask.run(TestFramework.scala:92)
>         at
> sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
>         at
> sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
>         at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
>         at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
>         at sbt.impl.RunTask.runTask(RunTask.scala:85)
>         at
> sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
>         at
> sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
>         at
> sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
>         at
> sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
>         at
> sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
>         at sbt.Control$.trapUnit(Control.scala:19)
>         at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
> [info] Test Starting: testProduceAndFetch(kafka.integration.LazyInitProducerTest)
> [error] Test Failed: testProduceAndFetch(kafka.integration.LazyInitProducerTest)
> java.net.BindException: Address already in use
>         at sun.nio.ch.Net.bind(Native Method)
>         at
> sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:126)
>         at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:59)
>         at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:52)
>         at
> org.apache.zookeeper.server.NIOServerCnxn$Factory.<init>(NIOServerCnxn.java:144)
>         at
> org.apache.zookeeper.server.NIOServerCnxn$Factory.<init>(NIOServerCnxn.java:125)
>         at kafka.zk.EmbeddedZookeeper.<init>(EmbeddedZookeeper.scala:32)
>         at
> kafka.zk.ZooKeeperTestHarness$class.setUp(ZooKeeperTestHarness.scala:32)
>         at
> kafka.integration.LazyInitProducerTest.kafka$integration$KafkaServerTestHarness$$super$setUp(LazyInitProducerTest.scala:33)
>         at
> kafka.integration.KafkaServerTestHarness$class.setUp(KafkaServerTestHarness.scala:35)
>         at
> kafka.integration.LazyInitProducerTest.kafka$integration$ProducerConsumerTestHarness$$super$setUp(LazyInitProducerTest.scala:33)
>         at
> kafka.integration.ProducerConsumerTestHarness$class.setUp(ProducerConsumerTestHarness.scala:34)
>         at
> kafka.integration.LazyInitProducerTest.setUp(LazyInitProducerTest.scala:42)
>         at junit.framework.TestCase.runBare(TestCase.java:128)
>         at junit.framework.TestResult$1.protect(TestResult.java:110)
>         at junit.framework.TestResult.runProtected(TestResult.java:128)
>         at junit.framework.TestResult.run(TestResult.java:113)
>         at junit.framework.TestCase.run(TestCase.java:120)
>         at junit.framework.TestSuite.runTest(TestSuite.java:228)
>         at junit.framework.TestSuite.run(TestSuite.java:223)
>         at junit.framework.TestSuite.runTest(TestSuite.java:228)
>         at junit.framework.TestSuite.run(TestSuite.java:223)
>         at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
>         at
> org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
>         at sbt.TestRunner.run(TestFramework.scala:53)
>         at sbt.TestRunner.runTest$1(TestFramework.scala:67)
>         at sbt.TestRunner.run(TestFramework.scala:76)
>         at
> sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
>         at
> sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
>         at
> sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
>         at sbt.NamedTestTask.run(TestFramework.scala:92)
>         at
> sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
>         at
> sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
>         at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
>         at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
>         at sbt.impl.RunTask.runTask(RunTask.scala:85)
>         at
> sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
>         at
> sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
>         at
> sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
>         at
> sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
>         at
> sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
>         at sbt.Control$.trapUnit(Control.scala:19)
>         at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
>  [0m[ [0minfo [0m]  [0mTest Starting:
> testMultiProduceResend(kafka.integration.LazyInitProducerTest) [0m
>  [0m[ [31merror [0m]  [0mTest Failed:
> testMultiProduceResend(kafka.integration.LazyInitProducerTest) [0m
> java.net.BindException: Address already in use
>         at sun.nio.ch.Net.bind(Native Method)
>         at
> sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:126)
>         at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:59)
>         at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:52)
>         at
> org.apache.zookeeper.server.NIOServerCnxn$Factory.<init>(NIOServerCnxn.java:144)
>         at
> org.apache.zookeeper.server.NIOServerCnxn$Factory.<init>(NIOServerCnxn.java:125)
>         at kafka.zk.EmbeddedZookeeper.<init>(EmbeddedZookeeper.scala:32)
>         at
> kafka.zk.ZooKeeperTestHarness$class.setUp(ZooKeeperTestHarness.scala:32)
>         at
> kafka.integration.LazyInitProducerTest.kafka$integration$KafkaServerTestHarness$$super$setUp(LazyInitProducerTest.scala:33)
>         at
> kafka.integration.KafkaServerTestHarness$class.setUp(KafkaServerTestHarness.scala:35)
>         at
> kafka.integration.LazyInitProducerTest.kafka$integration$ProducerConsumerTestHarness$$super$setUp(LazyInitProducerTest.scala:33)
>         at
> kafka.integration.ProducerConsumerTestHarness$class.setUp(ProducerConsumerTestHarness.scala:34)
>         at
> kafka.integration.LazyInitProducerTest.setUp(LazyInitProducerTest.scala:42)
>         at junit.framework.TestCase.runBare(TestCase.java:128)
>         at junit.framework.TestResult$1.protect(TestResult.java:110)
>         at junit.framework.TestResult.runProtected(TestResult.java:128)
>         at junit.framework.TestResult.run(TestResult.java:113)
>         at junit.framework.TestCase.run(TestCase.java:120)
>         at junit.framework.TestSuite.runTest(TestSuite.java:228)
>         at junit.framework.TestSuite.run(TestSuite.java:223)
>         at junit.framework.TestSuite.runTest(TestSuite.java:228)
>         at junit.framework.TestSuite.run(TestSuite.java:223)
>         at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
>         at
> org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
>         at sbt.TestRunner.run(TestFramework.scala:53)
>         at sbt.TestRunner.runTest$1(TestFramework.scala:67)
>         at sbt.TestRunner.run(TestFramework.scala:76)
>         at
> sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
>         at
> sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
>         at
> sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
>         at sbt.NamedTestTask.run(TestFramework.scala:92)
>         at
> sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
>         at
> sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
>         at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
>         at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
>         at sbt.impl.RunTask.runTask(RunTask.scala:85)
>         at
> sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
>         at
> sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
>         at
> sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
>         at
> sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
>         at
> sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
>         at sbt.Control$.trapUnit(Control.scala:19)
>         at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
>  [0m[ [0minfo [0m]  [34m== core-kafka /
> kafka.integration.LazyInitProducerTest == [0m
>  [0m[ [0minfo [0m]  [34m [0m
>  [0m[ [0minfo [0m]  [34m== core-kafka / kafka.server.LeaderElectionTest ==
> [0m
>  [0m[ [0minfo [0m]  [0mTest Starting:
> testLeaderElectionAndEpoch(kafka.server.LeaderElectionTest) [0m
>  [0m[ [31merror [0m]  [0mTest Failed:
> testLeaderElectionAndEpoch(kafka.server.LeaderElectionTest) [0m
> java.net.BindException: Address already in use
>         at sun.nio.ch.Net.bind(Native Method)
>         at
> sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:126)
>         at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:59)
>         at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:52)
>         at
> org.apache.zookeeper.server.NIOServerCnxn$Factory.<init>(NIOServerCnxn.java:144)
>         at
> org.apache.zookeeper.server.NIOServerCnxn$Factory.<init>(NIOServerCnxn.java:125)
>         at kafka.zk.EmbeddedZookeeper.<init>(EmbeddedZookeeper.scala:32)
>         at
> kafka.zk.ZooKeeperTestHarness$class.setUp(ZooKeeperTestHarness.scala:32)
>         at
> kafka.server.LeaderElectionTest.setUp(LeaderElectionTest.scala:39)
>         at junit.framework.TestCase.runBare(TestCase.java:128)
>         at junit.framework.TestResult$1.protect(TestResult.java:110)
>         at junit.framework.TestResult.runProtected(TestResult.java:128)
>         at junit.framework.TestResult.run(TestResult.java:113)
>         at junit.framework.TestCase.run(TestCase.java:120)
>         at junit.framework.TestSuite.runTest(TestSuite.java:228)
>         at junit.framework.TestSuite.run(TestSuite.java:223)
>         at junit.framework.TestSuite.runTest(TestSuite.java:228)
>         at junit.framework.TestSuite.run(TestSuite.java:223)
>         at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
>         at
> org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
>         at sbt.TestRunner.run(TestFramework.scala:53)
>         at sbt.TestRunner.runTest$1(TestFramework.scala:67)
>         at sbt.TestRunner.run(TestFramework.scala:76)
>         at
> sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
>         at
> sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
>         at
> sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
>         at sbt.NamedTestTask.run(TestFramework.scala:92)
>         at
> sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
>         at
> sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
>         at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
>         at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
>         at sbt.impl.RunTask.runTask(RunTask.scala:85)
>         at
> sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
>         at
> sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
>         at
> sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
>         at
> sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
>         at
> sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
>         at sbt.Control$.trapUnit(Control.scala:19)
>         at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
>  [0m[ [0minfo [0m]  [34m== core-kafka / kafka.server.LeaderElectionTest ==
> [0m
>  [0m[ [0minfo [0m]  [34m [0m
>  [0m[ [0minfo [0m]  [34m== core-kafka / kafka.network.SocketServerTest ==
> [0m
>  [0m[ [0minfo [0m]  [0mTest Starting: simpleRequest [0m
>  [0m[ [0minfo [0m]  [0mTest Passed: simpleRequest [0m
>  [0m[ [0minfo [0m]  [0mTest Starting: tooBigRequestIsRejected [0m
>  [0m[ [0minfo [0m]  [0mTest Passed: tooBigRequestIsRejected [0m
>  [0m[ [0minfo [0m]  [34m== core-kafka / kafka.network.SocketServerTest ==
> [0m
>  [0m[ [0minfo [0m]  [34m [0m
>  [0m[ [0minfo [0m]  [34m== core-kafka / test-finish == [0m
>  [0m[ [31merror [0m]  [0mFailed: : Total 140, Failed 61, Errors 0, Passed
> 79, Skipped 0 [0m
>  [0m[ [0minfo [0m]  [34m== core-kafka / test-finish == [0m
>  [0m[ [0minfo [0m]  [34m [0m
>  [0m[ [0minfo [0m]  [34m== core-kafka / Test cleanup 1 == [0m
>  [0m[ [0minfo [0m]  [0mDeleting directory /tmp/sbt_c45e98f8 [0m
>  [0m[ [0minfo [0m]  [34m== core-kafka / Test cleanup 1 == [0m
>  [0m[ [0minfo [0m]  [34m [0m
>  [0m[ [0minfo [0m]  [34m== core-kafka / test-cleanup == [0m
>  [0m[ [0minfo [0m]  [34m== core-kafka / test-cleanup == [0m
>  [0m[ [31merror [0m]  [0mError running kafka.zk.ZKEphemeralTest: Test
> FAILED [0m
>  [0m[ [31merror [0m]  [0mError running kafka.server.ReplicaFetchTest: Test
> FAILED [0m
>  [0m[ [31merror [0m]  [0mError running kafka.server.LeaderElectionTest:
> Test FAILED [0m
>  [0m[ [31merror [0m]  [0mError running
> kafka.javaapi.consumer.ZookeeperConsumerConnectorTest: Test FAILED [0m
>  [0m[ [31merror [0m]  [0mError running kafka.server.ServerShutdownTest:
> Test FAILED [0m
>  [0m[ [31merror [0m]  [0mError running kafka.producer.SyncProducerTest:
> Test FAILED [0m
>  [0m[ [31merror [0m]  [0mError running test: One or more subtasks failed
> [0m
>  [0m[ [31merror [0m]  [0mError running kafka.server.LogRecoveryTest: Test
> FAILED [0m
>  [0m[ [31merror [0m]  [0mError running
> kafka.integration.LazyInitProducerTest: Test FAILED [0m
>  [0m[ [31merror [0m]  [0mError running
> kafka.integration.AutoOffsetResetTest: Test FAILED [0m
>  [0m[ [31merror [0m]  [0mError running kafka.admin.AdminTest: Test FAILED
> [0m
>  [0m[ [31merror [0m]  [0mError running kafka.log.LogOffsetTest: Test
> FAILED [0m
>  [0m[ [31merror [0m]  [0mError running kafka.log.LogCorruptionTest: Test
> FAILED [0m
>  [0m[ [31merror [0m]  [0mError running
> kafka.integration.TopicMetadataTest: Test FAILED [0m
>  [0m[ [31merror [0m]  [0mError running kafka.log.LogManagerTest: Test
> FAILED [0m
>  [0m[ [31merror [0m]  [0mError running
> kafka.consumer.ZookeeperConsumerConnectorTest: Test FAILED [0m
>  [0m[ [31merror [0m]  [0mError running
> kafka.integration.BackwardsCompatibilityTest: Test FAILED [0m
>  [0m[ [31merror [0m]  [0mError running
> kafka.controller.ControllerBasicTest: Test FAILED [0m
>  [0m[ [31merror [0m]  [0mError running kafka.integration.PrimitiveApiTest:
> Test FAILED [0m
>  [0m[ [0minfo [0m]  [0m [0m
>  [0m[ [0minfo [0m]  [0mTotal time: 109 s, completed Aug 17, 2012 10:08:41
> PM [0m
>  [0m[ [0minfo [0m]  [0m [0m
>  [0m[ [0minfo [0m]  [0mTotal session time: 109 s, completed Aug 17, 2012
> 10:08:41 PM [0m
>  [0m[ [31merror [0m]  [0mError during build. [0m
> Build step 'Execute shell' marked build as failure
>

Build failed in Jenkins: Kafka-0.8 #23

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Kafka-0.8/23/changes>

Changes:

[junrao] enforce broker.id to be a non-negative integer; patched by Swapnil Ghike; reviewed by Jun Rao, Neha Narkhede; KAFKA-424

------------------------------------------
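The KAFKA-424 change above amounts to validating broker.id when the server configuration is parsed, so that a negative id is rejected at startup. A minimal sketch of that kind of check follows; requiredNonNegativeInt is a hypothetical helper for illustration, not the actual KafkaConfig code.

    // Sketch only: illustrates the non-negative constraint on broker.id,
    // not the real Kafka configuration parsing.
    import java.util.Properties

    object BrokerIdCheck {
      def requiredNonNegativeInt(props: Properties, name: String): Int = {
        val raw = props.getProperty(name)
        require(raw != null, name + " is required")
        val id = raw.toInt // NumberFormatException for anything that is not an integer
        require(id >= 0, name + " must be a non-negative integer, found " + id)
        id
      }

      def main(args: Array[String]): Unit = {
        val props = new Properties()
        props.put("broker.id", "0")
        println("broker.id = " + requiredNonNegativeInt(props, "broker.id")) // prints: broker.id = 0
      }
    }
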
[...truncated 3791 lines...]
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
	at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
	at sbt.TestRunner.run(TestFramework.scala:53)
	at sbt.TestRunner.runTest$1(TestFramework.scala:67)
	at sbt.TestRunner.run(TestFramework.scala:76)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.NamedTestTask.run(TestFramework.scala:92)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
	at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
	at sbt.impl.RunTask.runTask(RunTask.scala:85)
	at sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Control$.trapUnit(Control.scala:19)
	at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
[info] Test Starting: testMultiProduce(kafka.integration.LazyInitProducerTest)
[error] Test Failed: testMultiProduce(kafka.integration.LazyInitProducerTest)
java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind(Native Method)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:126)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:59)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:52)
	at org.apache.zookeeper.server.NIOServerCnxn$Factory.<init>(NIOServerCnxn.java:144)
	at org.apache.zookeeper.server.NIOServerCnxn$Factory.<init>(NIOServerCnxn.java:125)
	at kafka.zk.EmbeddedZookeeper.<init>(EmbeddedZookeeper.scala:32)
	at kafka.zk.ZooKeeperTestHarness$class.setUp(ZooKeeperTestHarness.scala:32)
	at kafka.integration.LazyInitProducerTest.kafka$integration$KafkaServerTestHarness$$super$setUp(LazyInitProducerTest.scala:33)
	at kafka.integration.KafkaServerTestHarness$class.setUp(KafkaServerTestHarness.scala:35)
	at kafka.integration.LazyInitProducerTest.kafka$integration$ProducerConsumerTestHarness$$super$setUp(LazyInitProducerTest.scala:33)
	at kafka.integration.ProducerConsumerTestHarness$class.setUp(ProducerConsumerTestHarness.scala:34)
	at kafka.integration.LazyInitProducerTest.setUp(LazyInitProducerTest.scala:42)
	at junit.framework.TestCase.runBare(TestCase.java:128)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:120)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
	at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
	at sbt.TestRunner.run(TestFramework.scala:53)
	at sbt.TestRunner.runTest$1(TestFramework.scala:67)
	at sbt.TestRunner.run(TestFramework.scala:76)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.NamedTestTask.run(TestFramework.scala:92)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
	at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
	at sbt.impl.RunTask.runTask(RunTask.scala:85)
	at sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Control$.trapUnit(Control.scala:19)
	at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
[info] Test Starting: testProduceAndFetch(kafka.integration.LazyInitProducerTest)
[error] Test Failed: testProduceAndFetch(kafka.integration.LazyInitProducerTest)
java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind(Native Method)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:126)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:59)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:52)
	at org.apache.zookeeper.server.NIOServerCnxn$Factory.<init>(NIOServerCnxn.java:144)
	at org.apache.zookeeper.server.NIOServerCnxn$Factory.<init>(NIOServerCnxn.java:125)
	at kafka.zk.EmbeddedZookeeper.<init>(EmbeddedZookeeper.scala:32)
	at kafka.zk.ZooKeeperTestHarness$class.setUp(ZooKeeperTestHarness.scala:32)
	at kafka.integration.LazyInitProducerTest.kafka$integration$KafkaServerTestHarness$$super$setUp(LazyInitProducerTest.scala:33)
	at kafka.integration.KafkaServerTestHarness$class.setUp(KafkaServerTestHarness.scala:35)
	at kafka.integration.LazyInitProducerTest.kafka$integration$ProducerConsumerTestHarness$$super$setUp(LazyInitProducerTest.scala:33)
	at kafka.integration.ProducerConsumerTestHarness$class.setUp(ProducerConsumerTestHarness.scala:34)
	at kafka.integration.LazyInitProducerTest.setUp(LazyInitProducerTest.scala:42)
	at junit.framework.TestCase.runBare(TestCase.java:128)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:120)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
	at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
	at sbt.TestRunner.run(TestFramework.scala:53)
	at sbt.TestRunner.runTest$1(TestFramework.scala:67)
	at sbt.TestRunner.run(TestFramework.scala:76)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.NamedTestTask.run(TestFramework.scala:92)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
	at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
	at sbt.impl.RunTask.runTask(RunTask.scala:85)
	at sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Control$.trapUnit(Control.scala:19)
	at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
[info] Test Starting: testMultiProduceResend(kafka.integration.LazyInitProducerTest)
[error] Test Failed: testMultiProduceResend(kafka.integration.LazyInitProducerTest)
java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind(Native Method)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:126)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:59)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:52)
	at org.apache.zookeeper.server.NIOServerCnxn$Factory.<init>(NIOServerCnxn.java:144)
	at org.apache.zookeeper.server.NIOServerCnxn$Factory.<init>(NIOServerCnxn.java:125)
	at kafka.zk.EmbeddedZookeeper.<init>(EmbeddedZookeeper.scala:32)
	at kafka.zk.ZooKeeperTestHarness$class.setUp(ZooKeeperTestHarness.scala:32)
	at kafka.integration.LazyInitProducerTest.kafka$integration$KafkaServerTestHarness$$super$setUp(LazyInitProducerTest.scala:33)
	at kafka.integration.KafkaServerTestHarness$class.setUp(KafkaServerTestHarness.scala:35)
	at kafka.integration.LazyInitProducerTest.kafka$integration$ProducerConsumerTestHarness$$super$setUp(LazyInitProducerTest.scala:33)
	at kafka.integration.ProducerConsumerTestHarness$class.setUp(ProducerConsumerTestHarness.scala:34)
	at kafka.integration.LazyInitProducerTest.setUp(LazyInitProducerTest.scala:42)
	at junit.framework.TestCase.runBare(TestCase.java:128)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:120)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
	at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
	at sbt.TestRunner.run(TestFramework.scala:53)
	at sbt.TestRunner.runTest$1(TestFramework.scala:67)
	at sbt.TestRunner.run(TestFramework.scala:76)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.NamedTestTask.run(TestFramework.scala:92)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
	at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
	at sbt.impl.RunTask.runTask(RunTask.scala:85)
	at sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Control$.trapUnit(Control.scala:19)
	at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
[info] == core-kafka / kafka.integration.LazyInitProducerTest ==
[info] 
[info] == core-kafka / kafka.server.LeaderElectionTest ==
[info] Test Starting: testLeaderElectionAndEpoch(kafka.server.LeaderElectionTest)
[error] Test Failed: testLeaderElectionAndEpoch(kafka.server.LeaderElectionTest)
java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind(Native Method)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:126)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:59)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:52)
	at org.apache.zookeeper.server.NIOServerCnxn$Factory.<init>(NIOServerCnxn.java:144)
	at org.apache.zookeeper.server.NIOServerCnxn$Factory.<init>(NIOServerCnxn.java:125)
	at kafka.zk.EmbeddedZookeeper.<init>(EmbeddedZookeeper.scala:32)
	at kafka.zk.ZooKeeperTestHarness$class.setUp(ZooKeeperTestHarness.scala:32)
	at kafka.server.LeaderElectionTest.setUp(LeaderElectionTest.scala:39)
	at junit.framework.TestCase.runBare(TestCase.java:128)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:120)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
	at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
	at sbt.TestRunner.run(TestFramework.scala:53)
	at sbt.TestRunner.runTest$1(TestFramework.scala:67)
	at sbt.TestRunner.run(TestFramework.scala:76)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.NamedTestTask.run(TestFramework.scala:92)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
	at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
	at sbt.impl.RunTask.runTask(RunTask.scala:85)
	at sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Control$.trapUnit(Control.scala:19)
	at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
[info] == core-kafka / kafka.server.LeaderElectionTest ==
[info] 
[info] == core-kafka / kafka.network.SocketServerTest ==
[info] Test Starting: simpleRequest
[info] Test Passed: simpleRequest
[info] Test Starting: tooBigRequestIsRejected
[info] Test Passed: tooBigRequestIsRejected
[info] == core-kafka / kafka.network.SocketServerTest ==
[info] 
[info] == core-kafka / test-finish ==
[error] Failed: : Total 140, Failed 61, Errors 0, Passed 79, Skipped 0
[info] == core-kafka / test-finish ==
[info] 
[info] == core-kafka / Test cleanup 1 ==
[info] Deleting directory /tmp/sbt_c45e98f8
[info] == core-kafka / Test cleanup 1 ==
[info] 
[info] == core-kafka / test-cleanup ==
[info] == core-kafka / test-cleanup ==
[error] Error running kafka.zk.ZKEphemeralTest: Test FAILED
[error] Error running kafka.server.ReplicaFetchTest: Test FAILED
[error] Error running kafka.server.LeaderElectionTest: Test FAILED
[error] Error running kafka.javaapi.consumer.ZookeeperConsumerConnectorTest: Test FAILED
[error] Error running kafka.server.ServerShutdownTest: Test FAILED
[error] Error running kafka.producer.SyncProducerTest: Test FAILED
[error] Error running test: One or more subtasks failed
[error] Error running kafka.server.LogRecoveryTest: Test FAILED
[error] Error running kafka.integration.LazyInitProducerTest: Test FAILED
[error] Error running kafka.integration.AutoOffsetResetTest: Test FAILED
[error] Error running kafka.admin.AdminTest: Test FAILED
[error] Error running kafka.log.LogOffsetTest: Test FAILED
[error] Error running kafka.log.LogCorruptionTest: Test FAILED
[error] Error running kafka.integration.TopicMetadataTest: Test FAILED
[error] Error running kafka.log.LogManagerTest: Test FAILED
[error] Error running kafka.consumer.ZookeeperConsumerConnectorTest: Test FAILED
[error] Error running kafka.integration.BackwardsCompatibilityTest: Test FAILED
[error] Error running kafka.controller.ControllerBasicTest: Test FAILED
[error] Error running kafka.integration.PrimitiveApiTest: Test FAILED
[info] 
[info] Total time: 109 s, completed Aug 17, 2012 10:08:41 PM
[info] 
[info] Total session time: 109 s, completed Aug 17, 2012 10:08:41 PM
[error] Error during build.
Build step 'Execute shell' marked build as failure
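Every failure shown in the log above is the same java.net.BindException, thrown while EmbeddedZookeeper binds its listener port during test setUp; that is what happens when a fixed test port is still held by an earlier run or by a concurrent job on the same Jenkins node. A minimal sketch of the usual workaround, assuming nothing about the real Kafka test harness: bind port 0 so the OS hands back an unused ephemeral port, then start the embedded server on that port.

    // Sketch only: ask the kernel for a free ephemeral port by binding port 0,
    // then release it so an embedded server can bind it right afterwards.
    import java.net.ServerSocket

    object ChoosePort {
      def freePort(): Int = {
        val socket = new ServerSocket(0) // port 0 => the OS picks an unused port
        try socket.getLocalPort
        finally socket.close()
      }

      def main(args: Array[String]): Unit = {
        println("an embedded ZooKeeper could listen on 127.0.0.1:" + freePort())
      }
    }

Another process can in principle grab the port between close() and the re-bind, but in practice this is usually enough to avoid the cross-build collisions seen here.
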

Build failed in Jenkins: Kafka-0.8 #22

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Kafka-0.8/22/changes>

Changes:

[jjkoshy] KAFKA-385 Fix race condition between checkSatisfied and expire in RequestPurgatory; fixed constant expiration of follower fetch requests as checkSatisfied was not getting called; add metrics to the RequestPurgatory; add a KafkaTimer convenience class; patched by Joel Koshy; reviewed by Jun Rao and Jay Kreps.

------------------------------------------
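The KAFKA-385 note above also mentions a KafkaTimer convenience class for timing a block of code (exercised by unit.kafka.metrics.KafkaTimerTest further down in this log). As a stand-in sketch that deliberately uses System.nanoTime and println instead of the real metrics Timer, the idea is a method that takes a block by name, runs it, and records how long it took:

    // Sketch only: not the actual KafkaTimer; it just shows the "time a block" shape.
    object BlockTimer {
      def time[A](name: String)(block: => A): A = {
        val start = System.nanoTime()
        try block
        finally println(name + " took " + ((System.nanoTime() - start) / 1000000) + " ms")
      }

      def main(args: Array[String]): Unit = {
        val result = time("sleep-then-add") {
          Thread.sleep(50)
          1 + 1
        }
        println("result = " + result) // result = 2
      }
    }
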
[...truncated 4348 lines...]
[info] Test Starting: testLoadInvalidLogsFails
[info] Test Passed: testLoadInvalidLogsFails
[info] Test Starting: testAppendAndRead
[info] Test Passed: testAppendAndRead
[info] Test Starting: testReadOutOfRange
[info] Test Passed: testReadOutOfRange
[info] Test Starting: testLogRolls
[info] Test Passed: testLogRolls
[info] Test Starting: testFindSegment
[info] Test Passed: testFindSegment
[info] Test Starting: testEdgeLogRolls
[info] Test Passed: testEdgeLogRolls
[info] == core-kafka / kafka.log.LogTest ==
[info] 
[info] == core-kafka / kafka.integration.BackwardsCompatibilityTest ==
[info] Test Starting: testProtocolVersion0(kafka.integration.BackwardsCompatibilityTest)
[2012-08-16 22:19:29,705] WARN Exception causing close of session 0x0 due to java.io.IOException: ZooKeeperServer not running (org.apache.zookeeper.server.NIOServerCnxn:639)
[info] Test Passed: testProtocolVersion0(kafka.integration.BackwardsCompatibilityTest)
[info] == core-kafka / kafka.integration.BackwardsCompatibilityTest ==
[info] 
[info] == core-kafka / kafka.javaapi.consumer.ZookeeperConsumerConnectorTest ==
[info] Test Starting: testBasic(kafka.javaapi.consumer.ZookeeperConsumerConnectorTest)
[2012-08-16 22:19:31,601] ERROR Closing socket for /127.0.0.1 because of error (kafka.network.Processor:99)
java.io.IOException: Connection reset by peer
	at sun.nio.ch.FileDispatcher.read0(Native Method)
	at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:21)
	at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:198)
	at sun.nio.ch.IOUtil.read(IOUtil.java:171)
	at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:243)
	at kafka.utils.Utils$.read(Utils.scala:631)
	at kafka.network.BoundedByteBufferReceive.readFrom(BoundedByteBufferReceive.scala:54)
	at kafka.network.Processor.read(SocketServer.scala:296)
	at kafka.network.Processor.run(SocketServer.scala:212)
	at java.lang.Thread.run(Thread.java:662)
[info] Test Passed: testBasic(kafka.javaapi.consumer.ZookeeperConsumerConnectorTest)
[info] == core-kafka / kafka.javaapi.consumer.ZookeeperConsumerConnectorTest ==
[info] 
[info] == core-kafka / kafka.log.LogOffsetTest ==
[info] Test Starting: testGetOffsetsForUnknownTopic(kafka.log.LogOffsetTest)
[info] Test Passed: testGetOffsetsForUnknownTopic(kafka.log.LogOffsetTest)
[info] Test Starting: testGetOffsetsBeforeLatestTime(kafka.log.LogOffsetTest)
[info] Test Passed: testGetOffsetsBeforeLatestTime(kafka.log.LogOffsetTest)
[info] Test Starting: testEmptyLogsGetOffsets(kafka.log.LogOffsetTest)
[info] Test Passed: testEmptyLogsGetOffsets(kafka.log.LogOffsetTest)
[info] Test Starting: testGetOffsetsBeforeNow(kafka.log.LogOffsetTest)
Offsets = 240,216,108,0
[info] Test Passed: testGetOffsetsBeforeNow(kafka.log.LogOffsetTest)
[info] Test Starting: testGetOffsetsBeforeEarliestTime(kafka.log.LogOffsetTest)
[info] Test Passed: testGetOffsetsBeforeEarliestTime(kafka.log.LogOffsetTest)
[info] == core-kafka / kafka.log.LogOffsetTest ==
[info] 
[info] == core-kafka / kafka.consumer.FetcherTest ==
[info] Test Starting: testFetcher(kafka.consumer.FetcherTest)
[info] Test Passed: testFetcher(kafka.consumer.FetcherTest)
[info] == core-kafka / kafka.consumer.FetcherTest ==
[info] 
[info] == core-kafka / kafka.integration.TopicMetadataTest ==
[info] Test Starting: testTopicMetadataRequest(kafka.integration.TopicMetadataTest)
[info] Test Passed: testTopicMetadataRequest(kafka.integration.TopicMetadataTest)
[info] Test Starting: testBasicTopicMetadata(kafka.integration.TopicMetadataTest)
[info] Test Passed: testBasicTopicMetadata(kafka.integration.TopicMetadataTest)
[info] Test Starting: testAutoCreateTopic(kafka.integration.TopicMetadataTest)
[info] Test Passed: testAutoCreateTopic(kafka.integration.TopicMetadataTest)
[info] == core-kafka / kafka.integration.TopicMetadataTest ==
[info] 
[info] == core-kafka / kafka.network.SocketServerTest ==
[info] Test Starting: simpleRequest
[info] Test Passed: simpleRequest
[info] Test Starting: tooBigRequestIsRejected
[info] Test Passed: tooBigRequestIsRejected
[info] == core-kafka / kafka.network.SocketServerTest ==
[info] 
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] Test Starting: testKafkaLog4jConfigs(kafka.log4j.KafkaLog4jAppenderTest)
log4j:WARN No appenders could be found for logger (org.I0Itec.zkclient.ZkEventThread).
log4j:WARN Please initialize the log4j system properly.
[info] Test Passed: testKafkaLog4jConfigs(kafka.log4j.KafkaLog4jAppenderTest)
[info] Test Starting: testLog4jAppends(kafka.log4j.KafkaLog4jAppenderTest)
[info] Test Passed: testLog4jAppends(kafka.log4j.KafkaLog4jAppenderTest)
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] 
[info] == core-kafka / kafka.server.LogRecoveryTest ==
[info] Test Starting: testHWCheckpointNoFailuresSingleLogSegment(kafka.server.LogRecoveryTest)
[info] Test Passed: testHWCheckpointNoFailuresSingleLogSegment(kafka.server.LogRecoveryTest)
[info] Test Starting: testHWCheckpointWithFailuresSingleLogSegment(kafka.server.LogRecoveryTest)
[info] Test Passed: testHWCheckpointWithFailuresSingleLogSegment(kafka.server.LogRecoveryTest)
[info] Test Starting: testHWCheckpointNoFailuresMultipleLogSegments(kafka.server.LogRecoveryTest)
[info] Test Passed: testHWCheckpointNoFailuresMultipleLogSegments(kafka.server.LogRecoveryTest)
[info] Test Starting: testHWCheckpointWithFailuresMultipleLogSegments(kafka.server.LogRecoveryTest)
[info] Test Passed: testHWCheckpointWithFailuresMultipleLogSegments(kafka.server.LogRecoveryTest)
[info] == core-kafka / kafka.server.LogRecoveryTest ==
[info] 
[info] == core-kafka / kafka.message.MessageTest ==
[info] Test Starting: testFieldValues
[info] Test Passed: testFieldValues
[info] Test Starting: testChecksum
[info] Test Passed: testChecksum
[info] Test Starting: testEquality
[info] Test Passed: testEquality
[info] Test Starting: testIsHashable
[info] Test Passed: testIsHashable
[info] == core-kafka / kafka.message.MessageTest ==
[info] 
[info] == core-kafka / kafka.consumer.ZookeeperConsumerConnectorTest ==
[info] Test Starting: testBasic(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testBasic(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Starting: testCompression(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testCompression(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Starting: testCompressionSetConsumption(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testCompressionSetConsumption(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Starting: testConsumerDecoder(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testConsumerDecoder(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Starting: testLeaderSelectionForPartition(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testLeaderSelectionForPartition(kafka.consumer.ZookeeperConsumerConnectorTest)
[info] == core-kafka / kafka.consumer.ZookeeperConsumerConnectorTest ==
[info] 
[info] == core-kafka / kafka.server.HighwatermarkPersistenceTest ==
[info] Test Starting: testHighWatermarkPersistenceSinglePartition(kafka.server.HighwatermarkPersistenceTest)
[info] Test Passed: testHighWatermarkPersistenceSinglePartition(kafka.server.HighwatermarkPersistenceTest)
[info] Test Starting: testHighWatermarkPersistenceMultiplePartitions(kafka.server.HighwatermarkPersistenceTest)
[info] Test Passed: testHighWatermarkPersistenceMultiplePartitions(kafka.server.HighwatermarkPersistenceTest)
[info] == core-kafka / kafka.server.HighwatermarkPersistenceTest ==
[info] 
[info] == core-kafka / kafka.log.LogManagerTest ==
[info] Test Starting: testCreateLog(kafka.log.LogManagerTest)
[info] Test Passed: testCreateLog(kafka.log.LogManagerTest)
[info] Test Starting: testGetLog(kafka.log.LogManagerTest)
[info] Test Passed: testGetLog(kafka.log.LogManagerTest)
[info] Test Starting: testCleanupExpiredSegments(kafka.log.LogManagerTest)
[info] Test Passed: testCleanupExpiredSegments(kafka.log.LogManagerTest)
[info] Test Starting: testCleanupSegmentsToMaintainSize(kafka.log.LogManagerTest)
[info] Test Passed: testCleanupSegmentsToMaintainSize(kafka.log.LogManagerTest)
[info] Test Starting: testTimeBasedFlush(kafka.log.LogManagerTest)
[info] Test Passed: testTimeBasedFlush(kafka.log.LogManagerTest)
[info] Test Starting: testConfigurablePartitions(kafka.log.LogManagerTest)
[info] Test Passed: testConfigurablePartitions(kafka.log.LogManagerTest)
[info] == core-kafka / kafka.log.LogManagerTest ==
[info] 
[info] == core-kafka / kafka.server.LeaderElectionTest ==
[info] Test Starting: testLeaderElectionAndEpoch(kafka.server.LeaderElectionTest)
[info] Test Passed: testLeaderElectionAndEpoch(kafka.server.LeaderElectionTest)
[info] == core-kafka / kafka.server.LeaderElectionTest ==
[info] 
[info] == core-kafka / kafka.zk.ZKEphemeralTest ==
[info] Test Starting: testEphemeralNodeCleanup(kafka.zk.ZKEphemeralTest)
[info] Test Passed: testEphemeralNodeCleanup(kafka.zk.ZKEphemeralTest)
[info] == core-kafka / kafka.zk.ZKEphemeralTest ==
[info] 
[info] == core-kafka / kafka.producer.SyncProducerTest ==
[info] Test Starting: testReachableServer(kafka.producer.SyncProducerTest)
[info] Test Passed: testReachableServer(kafka.producer.SyncProducerTest)
[info] Test Starting: testEmptyProduceRequest(kafka.producer.SyncProducerTest)
[info] Test Passed: testEmptyProduceRequest(kafka.producer.SyncProducerTest)
[info] Test Starting: testSingleMessageSizeTooLarge(kafka.producer.SyncProducerTest)
[info] Test Passed: testSingleMessageSizeTooLarge(kafka.producer.SyncProducerTest)
[info] Test Starting: testCompressedMessageSizeTooLarge(kafka.producer.SyncProducerTest)
[info] Test Passed: testCompressedMessageSizeTooLarge(kafka.producer.SyncProducerTest)
[info] Test Starting: testProduceCorrectlyReceivesResponse(kafka.producer.SyncProducerTest)
[info] Test Passed: testProduceCorrectlyReceivesResponse(kafka.producer.SyncProducerTest)
[info] Test Starting: testProducerCanTimeout(kafka.producer.SyncProducerTest)
[info] Test Passed: testProducerCanTimeout(kafka.producer.SyncProducerTest)
[info] Test Starting: testProduceRequestForUnknownTopic(kafka.producer.SyncProducerTest)
[info] Test Passed: testProduceRequestForUnknownTopic(kafka.producer.SyncProducerTest)
[info] == core-kafka / kafka.producer.SyncProducerTest ==
[info] 
[info] == core-kafka / kafka.controller.ControllerBasicTest ==
[info] Test Starting: testControllerFailOver(kafka.controller.ControllerBasicTest)
[info] Test Passed: testControllerFailOver(kafka.controller.ControllerBasicTest)
[info] Test Starting: testControllerCommandSend(kafka.controller.ControllerBasicTest)
[info] Test Passed: testControllerCommandSend(kafka.controller.ControllerBasicTest)
[info] == core-kafka / kafka.controller.ControllerBasicTest ==
[info] 
[info] == core-kafka / unit.kafka.metrics.KafkaTimerTest ==
[info] Test Starting: testKafkaTimer(unit.kafka.metrics.KafkaTimerTest)
[info] Test Passed: testKafkaTimer(unit.kafka.metrics.KafkaTimerTest)
[info] == core-kafka / unit.kafka.metrics.KafkaTimerTest ==
[info] 
[info] == core-kafka / kafka.server.ReplicaFetchTest ==
[info] Test Starting: testReplicaFetcherThread(kafka.server.ReplicaFetchTest)
[info] Test Passed: testReplicaFetcherThread(kafka.server.ReplicaFetchTest)
[info] == core-kafka / kafka.server.ReplicaFetchTest ==
[info] 
[info] == core-kafka / Test cleanup 1 ==
[info] Deleting directory /tmp/sbt_c49d0162
[info] == core-kafka / Test cleanup 1 ==
[info] 
[info] == core-kafka / test-finish ==
[error] Failed: : Total 140, Failed 1, Errors 0, Passed 139, Skipped 0
[info] == core-kafka / test-finish ==
[info] 
[info] == core-kafka / test-cleanup ==
[info] == core-kafka / test-cleanup ==
[info] 
[info] == java-examples / test-compile ==
[info]   Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.
[info] Compiling test sources...
[info] Nothing to compile.
[info]   Post-analysis: 0 classes.
[info] == java-examples / test-compile ==
[info] 
[info] == java-examples / copy-test-resources ==
[info] == java-examples / copy-test-resources ==
[info] 
[info] == hadoop consumer / test-compile ==
[info]   Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.
[info] Compiling test sources...
[info] Nothing to compile.
[info]   Post-analysis: 0 classes.
[info] == hadoop consumer / test-compile ==
[info] 
[info] == hadoop consumer / copy-test-resources ==
[info] == hadoop consumer / copy-test-resources ==
[info] 
[info] == hadoop producer / test-compile ==
[info]   Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.
[info] Compiling test sources...
[info] Nothing to compile.
[info]   Post-analysis: 0 classes.
[info] == hadoop producer / test-compile ==
[info] 
[info] == hadoop consumer / copy-resources ==
[info] == hadoop consumer / copy-resources ==
[info] 
[info] == hadoop producer / copy-test-resources ==
[info] == hadoop producer / copy-test-resources ==
[info] 
[info] == perf / test-compile ==
[info]   Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.
[info] Compiling test sources...
[info] Nothing to compile.
[info]   Post-analysis: 0 classes.
[info] == perf / test-compile ==
[info] 
[info] == hadoop producer / copy-resources ==
[info] == hadoop producer / copy-resources ==
[info] 
[info] == perf / copy-resources ==
[info] == perf / copy-resources ==
[info] 
[info] == java-examples / copy-resources ==
[info] == java-examples / copy-resources ==
[error] Error running kafka.producer.AsyncProducerTest: Test FAILED
[error] Error running test: One or more subtasks failed
[info] 
[info] Total time: 195 s, completed Aug 16, 2012 10:20:24 PM
[info] 
[info] Total session time: 195 s, completed Aug 16, 2012 10:20:24 PM
[error] Error during build.
Build step 'Execute shell' marked build as failure