Posted to commits@kafka.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2012/08/25 08:08:29 UTC

Build failed in Jenkins: Kafka-trunk #141

See <https://builds.apache.org/job/Kafka-trunk/141/changes>

Changes:

[junrao] Require values in Utils.getTopic* methods to be positive; patched by Swapnil Ghike; reviewed by Jun Rao; KAFKA-481
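The KAFKA-481 change requires values parsed by the `Utils.getTopic*` helpers to be valid. As an illustrative sketch only (the exact semantics live in the KAFKA-481 patch; the class and method below are hypothetical, not Kafka's actual code), such a helper splits a "topic-partition" string and rejects malformed or negative partition values:

```java
// Hypothetical sketch of the validation KAFKA-481 describes.
// Not the actual kafka.utils.Utils code.
public class TopicUtils {
    // Returns the topic portion of a "topic-partition" string,
    // e.g. "test-topic-0" -> "test-topic". Throws on malformed input
    // or a negative partition suffix.
    static String getTopic(String topicPartition) {
        int dash = topicPartition.lastIndexOf('-');
        if (dash < 1 || dash == topicPartition.length() - 1)
            throw new IllegalArgumentException("Malformed: " + topicPartition);
        int partition = Integer.parseInt(topicPartition.substring(dash + 1));
        if (partition < 0)
            throw new IllegalArgumentException("Negative partition: " + partition);
        return topicPartition.substring(0, dash);
    }
}
```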

------------------------------------------
[...truncated 2288 lines...]
[2012-08-25 06:07:52,155] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-08-25 06:07:52,155] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-08-25 06:07:52,156] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-08-25 06:07:52,156] INFO Closing zookeeper client... (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:52,156] INFO zkActor stopped (kafka.log.LogManager:61)
[2012-08-25 06:07:52,158] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testPartitionedSendToNewBrokerInExistingTopic
[info] Test Starting: testDefaultPartitioner
[2012-08-25 06:07:53,203] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-08-25 06:07:53,204] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-08-25 06:07:53,204] INFO connecting to ZK: 127.0.0.1:2182 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,221] INFO Awaiting connections on port 50701 (kafka.network.Acceptor:130)
[2012-08-25 06:07:53,222] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-08-25 06:07:53,222] INFO Registering broker /brokers/ids/0 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,227] INFO Registering broker /brokers/ids/0 succeeded with id:0,creatorId:67.195.138.60-1345874873222,host:67.195.138.60,port:50701 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,227] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-08-25 06:07:53,227] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-08-25 06:07:53,228] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-08-25 06:07:53,228] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-08-25 06:07:53,229] INFO connecting to ZK: 127.0.0.1:2182 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,232] INFO Awaiting connections on port 43800 (kafka.network.Acceptor:130)
[2012-08-25 06:07:53,232] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-08-25 06:07:53,233] INFO Registering broker /brokers/ids/1 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,234] INFO Registering broker /brokers/ids/1 succeeded with id:1,creatorId:67.195.138.60-1345874873233,host:67.195.138.60,port:43800 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,234] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-08-25 06:07:53,235] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-08-25 06:07:53,235] INFO Connected to localhost:50701 for producing (kafka.producer.SyncProducer:61)
[2012-08-25 06:07:53,236] INFO Connected to localhost:43800 for producing (kafka.producer.SyncProducer:61)
[2012-08-25 06:07:53,236] INFO Created log for 'test-topic'-0 (kafka.log.LogManager:61)
[2012-08-25 06:07:53,236] INFO Begin registering broker topic /brokers/topics/test-topic/0 with 4 partitions (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,236] INFO Created log for 'test-topic'-2 (kafka.log.LogManager:61)
[2012-08-25 06:07:53,237] INFO Begin registering broker topic /brokers/topics/test-topic/1 with 4 partitions (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,246] INFO End registering broker topic /brokers/topics/test-topic/0 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,246] INFO End registering broker topic /brokers/topics/test-topic/1 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,737] INFO Closing all async producers (kafka.producer.ProducerPool:61)
[2012-08-25 06:07:53,738] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-08-25 06:07:53,739] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-08-25 06:07:53,740] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-08-25 06:07:53,740] INFO Closing zookeeper client... (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,740] INFO zkActor stopped (kafka.log.LogManager:61)
[2012-08-25 06:07:53,750] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[2012-08-25 06:07:53,751] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-08-25 06:07:53,751] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-08-25 06:07:53,752] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-08-25 06:07:53,752] INFO Closing zookeeper client... (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:53,752] INFO zkActor stopped (kafka.log.LogManager:61)
[2012-08-25 06:07:53,753] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testDefaultPartitioner
[info] == core-kafka / kafka.producer.ProducerTest ==
[info] 
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] Test Starting: testKafkaLog4jConfigs
[2012-08-25 06:07:54,767] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-08-25 06:07:54,767] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-08-25 06:07:54,767] INFO connecting to ZK: 127.0.0.1:2182 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:54,786] INFO Awaiting connections on port 60023 (kafka.network.Acceptor:130)
[2012-08-25 06:07:54,786] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-08-25 06:07:54,786] INFO Registering broker /brokers/ids/0 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:54,794] INFO Registering broker /brokers/ids/0 succeeded with id:0,creatorId:67.195.138.60-1345874874787,host:67.195.138.60,port:60023 (kafka.server.KafkaZooKeeper:61)
[2012-08-25 06:07:54,794] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-08-25 06:07:54,795] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-08-25 06:07:54,795] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-08-25 06:07:54,796] INFO starting log cleaner every 300000 ms (kafka.log.LogManager:61)
[2012-08-25 06:07:54,798] INFO Awaiting connections on port 50143 (kafka.network.Acceptor:130)
[2012-08-25 06:07:54,798] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-08-25 06:07:54,798] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-08-25 06:07:54,799] INFO Kafka server started. (kafka.server.KafkaServer:61)
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
log4j:WARN No appenders could be found for logger (kafka.producer.ProducerPool).
log4j:WARN Please initialize the log4j system properly.
[info] Test Passed: testKafkaLog4jConfigs
[info] Test Starting: testBrokerListLog4jAppends
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
[info] Test Passed: testBrokerListLog4jAppends
[info] Test Starting: testZkConnectLog4jAppends
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
[info] Test Passed: testZkConnectLog4jAppends
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
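The `log4j:WARN No appenders could be found` lines above are log4j 1.x's stock warning when a logger fires before any appender is configured; they are noise here, not a test failure. A minimal `log4j.properties` that silences it (a generic sketch matching the `[timestamp] LEVEL message (class:line)` pattern seen in this build's output, not this build's actual config file) looks like:

```properties
# Minimal root-logger setup; suppresses "No appenders could be found".
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c:%L)%n
```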
[info] 
[info] == core-kafka / kafka.producer.AsyncProducerTest ==
[info] Test Starting: testProducerQueueSize
Queue is full..
[info] Test Passed: testProducerQueueSize
[info] Test Starting: testAddAfterQueueClosed
[info] Test Passed: testAddAfterQueueClosed
[info] Test Starting: testBatchSize
[info] Test Passed: testBatchSize
[info] Test Starting: testQueueTimeExpired
[info] Test Passed: testQueueTimeExpired
[info] Test Starting: testSenderThreadShutdown
[info] Test Passed: testSenderThreadShutdown
[info] Test Starting: testCollateEvents
[info] Test Passed: testCollateEvents
[info] Test Starting: testCollateAndSerializeEvents
[info] Test Passed: testCollateAndSerializeEvents
[info] == core-kafka / kafka.producer.AsyncProducerTest ==
[info] 
[info] == core-kafka / kafka.integration.AutoOffsetResetTest ==
[info] Test Starting: testEarliestOffsetResetForward(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testEarliestOffsetResetForward(kafka.integration.AutoOffsetResetTest)
[info] Test Starting: testEarliestOffsetResetBackward(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testEarliestOffsetResetBackward(kafka.integration.AutoOffsetResetTest)
[info] Test Starting: testLatestOffsetResetForward(kafka.integration.AutoOffsetResetTest)
[error] Test Failed: testLatestOffsetResetForward(kafka.integration.AutoOffsetResetTest)
junit.framework.AssertionFailedError: expected:<0> but was:<3>
	at junit.framework.Assert.fail(Assert.java:47)
	at junit.framework.Assert.failNotEquals(Assert.java:277)
	at junit.framework.Assert.assertEquals(Assert.java:64)
	at junit.framework.Assert.assertEquals(Assert.java:195)
	at junit.framework.Assert.assertEquals(Assert.java:201)
	at kafka.integration.AutoOffsetResetTest.testLatestOffsetResetForward(AutoOffsetResetTest.scala:218)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:164)
	at junit.framework.TestCase.runBare(TestCase.java:130)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:120)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
	at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
	at sbt.TestRunner.run(TestFramework.scala:53)
	at sbt.TestRunner.runTest$1(TestFramework.scala:67)
	at sbt.TestRunner.run(TestFramework.scala:76)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.NamedTestTask.run(TestFramework.scala:92)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
	at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
	at sbt.impl.RunTask.runTask(RunTask.scala:85)
	at sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Control$.trapUnit(Control.scala:19)
	at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
[info] == core-kafka / kafka.integration.AutoOffsetResetTest ==
[info] 
[info] == core-kafka / kafka.integration.PrimitiveApiTest ==
[info] Test Starting: testProduceAndMultiFetch(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndMultiFetch(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndMultiFetchWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduce(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testMultiProduce(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduceWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testMultiProduceWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testDefaultEncoderProducerAndFetch(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testDefaultEncoderProducerAndFetch(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testDefaultEncoderProducerAndFetchWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testDefaultEncoderProducerAndFetchWithCompression(kafka.integration.PrimitiveApiTest)
[info] Test Starting: testConsumerNotExistTopic(kafka.integration.PrimitiveApiTest)
[info] Test Passed: testConsumerNotExistTopic(kafka.integration.PrimitiveApiTest)
[info] == core-kafka / kafka.integration.PrimitiveApiTest ==
[info] 
[info] == core-kafka / kafka.javaapi.producer.ProducerTest ==
[info] Test Starting: testSend
[info] Test Passed: testSend
[info] Test Starting: testSendSingleMessage
[info] Test Passed: testSendSingleMessage
[info] Test Starting: testInvalidPartition
[info] Test Passed: testInvalidPartition
[info] Test Starting: testSyncProducerPool
[info] Test Passed: testSyncProducerPool
[info] Test Starting: testAsyncProducerPool
[info] Test Passed: testAsyncProducerPool
[info] Test Starting: testSyncUnavailableProducerException
[info] Test Passed: testSyncUnavailableProducerException
[info] Test Starting: testAsyncUnavailableProducerException
[info] Test Passed: testAsyncUnavailableProducerException
[info] Test Starting: testConfigBrokerPartitionInfoWithPartitioner
[info] Test Passed: testConfigBrokerPartitionInfoWithPartitioner
[info] Test Starting: testConfigBrokerPartitionInfo
[info] Test Passed: testConfigBrokerPartitionInfo
[info] Test Starting: testZKSendToNewTopic
[info] Test Passed: testZKSendToNewTopic
[info] Test Starting: testZKSendWithDeadBroker
[info] Test Passed: testZKSendWithDeadBroker
[info] Test Starting: testPartitionedSendToNewTopic
[info] Test Passed: testPartitionedSendToNewTopic
[info] Test Starting: testPartitionedSendToNewBrokerInExistingTopic
[info] Test Passed: testPartitionedSendToNewBrokerInExistingTopic
[info] Test Starting: testDefaultPartitioner
[info] Test Passed: testDefaultPartitioner
[info] == core-kafka / kafka.javaapi.producer.ProducerTest ==
[info] 
[info] == core-kafka / kafka.message.FileMessageSetTest ==
[info] Test Starting: testWrittenEqualsRead
[info] Test Passed: testWrittenEqualsRead
[info] Test Starting: testIteratorIsConsistent
[info] Test Passed: testIteratorIsConsistent
[info] Test Starting: testSizeInBytes
[info] Test Passed: testSizeInBytes
[info] Test Starting: testWriteTo
[info] Test Passed: testWriteTo
[info] Test Starting: testFileSize
[info] Test Passed: testFileSize
[info] Test Starting: testIterationOverPartialAndTruncation
[info] Test Passed: testIterationOverPartialAndTruncation
[info] Test Starting: testIterationDoesntChangePosition
[info] Test Passed: testIterationDoesntChangePosition
[info] Test Starting: testRead
[info] Test Passed: testRead
[info] == core-kafka / kafka.message.FileMessageSetTest ==
[info] 
[info] == core-kafka / kafka.message.CompressionUtilTest ==
[info] Test Starting: testSimpleCompressDecompress
[info] Test Passed: testSimpleCompressDecompress
[info] Test Starting: testComplexCompressDecompress
[info] Test Passed: testComplexCompressDecompress
[info] Test Starting: testSnappyCompressDecompressExplicit
[info] Test Passed: testSnappyCompressDecompressExplicit
[info] == core-kafka / kafka.message.CompressionUtilTest ==
[info] 
[info] == core-kafka / kafka.log.LogCorruptionTest ==
[info] Test Starting: testMessageSizeTooLarge(kafka.log.LogCorruptionTest)
This is good
[info] Test Passed: testMessageSizeTooLarge(kafka.log.LogCorruptionTest)
[info] == core-kafka / kafka.log.LogCorruptionTest ==
[info] 
[info] == core-kafka / Test cleanup 1 ==
[info] Deleting directory /tmp/sbt_dc8c451e
[info] == core-kafka / Test cleanup 1 ==
[info] 
[info] == core-kafka / test-finish ==
[error] Failed: : Total 140, Failed 2, Errors 0, Passed 138, Skipped 0
[info] == core-kafka / test-finish ==
[info] 
[info] == core-kafka / test-cleanup ==
[info] == core-kafka / test-cleanup ==
[error] Error running kafka.zk.ZKEphemeralTest: Test FAILED
[error] Error running kafka.integration.AutoOffsetResetTest: Test FAILED
[error] Error running test: One or more subtasks failed
[info] 
[info] Total time: 131 s, completed Aug 25, 2012 6:08:28 AM
[info] 
[info] Total session time: 132 s, completed Aug 25, 2012 6:08:28 AM
[error] Error during build.
Build step 'Execute shell' marked build as failure

Jenkins build is back to normal : Kafka-trunk #146

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Kafka-trunk/146/changes>


Build failed in Jenkins: Kafka-trunk #145

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Kafka-trunk/145/changes>

Changes:

[joestein] KAFKA-533 changes to NOTICE and LICENSE related to KAFKA-534 removing client libraries from repo

[joestein] KAFKA-534 remove client library directory

------------------------------------------
[...truncated 1357 lines...]
[info] Test Starting: testRead
[info] Test Passed: testRead
[info] == core-kafka / kafka.message.FileMessageSetTest ==
[info] 
[info] == core-kafka / kafka.javaapi.integration.PrimitiveApiTest ==
[info] Test Starting: testProduceAndMultiFetch(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-27 01:40:14,424] WARN Wrong partition -1 valid partitions (0,0) (kafka.log.LogManager:73)
[2012-09-27 01:40:14,424] WARN Wrong partition -1 valid partitions (0,0) (kafka.log.LogManager:73)
[2012-09-27 01:40:14,424] WARN Wrong partition -1 valid partitions (0,0) (kafka.log.LogManager:73)
[info] Test Passed: testProduceAndMultiFetch(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-27 01:40:14,634] WARN Wrong partition -1 valid partitions (0,0) (kafka.log.LogManager:73)
[2012-09-27 01:40:14,634] WARN Wrong partition -1 valid partitions (0,0) (kafka.log.LogManager:73)
[2012-09-27 01:40:14,634] WARN Wrong partition -1 valid partitions (0,0) (kafka.log.LogManager:73)
[info] Test Passed: testProduceAndMultiFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduce(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testMultiProduce(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduceWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-27 01:40:14,993] WARN Session 0x13a056187c60009 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[info] Test Passed: testMultiProduceWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndFetch(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-27 01:40:15,411] WARN Session 0x13a056179d9000a for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[info] Test Passed: testProduceAndFetch(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-27 01:40:15,701] WARN Session 0x13a056199810004 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-27 01:40:15,719] WARN Session 0x13a056199810005 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-27 01:40:15,734] WARN Session 0x13a056187c60006 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[info] Test Passed: testProduceAndFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchJava(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndMultiFetchJava(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchJavaWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndMultiFetchJavaWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] == core-kafka / kafka.javaapi.integration.PrimitiveApiTest ==
[info] 
[info] == core-kafka / kafka.server.ServerShutdownTest ==
[info] Test Starting: testCleanShutdown
[2012-09-27 01:40:16,558] WARN Session 0x13a056187c60009 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[info] Test Passed: testCleanShutdown
[info] == core-kafka / kafka.server.ServerShutdownTest ==
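The repeated `ClientCnxn` WARN / `java.net.ConnectException: Connection refused` pairs above describe ZooKeeper's client loop: on a refused connection it closes the socket and attempts to reconnect. The tests still pass because the retry eventually succeeds or the session is no longer needed. A standalone sketch of that close-and-retry shape (a hypothetical helper, not ZooKeeper's actual `ClientCnxn` code):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// Sketch of retry-on-ConnectException: close the socket, back off,
// and attempt to reconnect a bounded number of times.
class Reconnector {
    static boolean connectWithRetry(String host, int port, int attempts, long backoffMs)
            throws InterruptedException {
        for (int i = 0; i < attempts; i++) {
            try (Socket s = new Socket()) {
                s.connect(new InetSocketAddress(host, port), 1000);
                return true;                 // connected
            } catch (IOException e) {        // e.g. ConnectException: Connection refused
                Thread.sleep(backoffMs);     // back off, then retry
            }
        }
        return false;                        // gave up after `attempts` tries
    }
}
```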
[info] 
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] Test Starting: testKafkaLog4jConfigs
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
log4j:WARN No appenders could be found for logger (org.I0Itec.zkclient.ZkEventThread).
log4j:WARN Please initialize the log4j system properly.
[info] Test Passed: testKafkaLog4jConfigs
[info] Test Starting: testBrokerListLog4jAppends
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
[info] Test Passed: testBrokerListLog4jAppends
[info] Test Starting: testZkConnectLog4jAppends
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
[info] Test Passed: testZkConnectLog4jAppends
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] 
[info] == core-kafka / kafka.zk.ZKEphemeralTest ==
[info] Test Starting: testEphemeralNodeCleanup(kafka.zk.ZKEphemeralTest)
[info] Test Passed: testEphemeralNodeCleanup(kafka.zk.ZKEphemeralTest)
[info] == core-kafka / kafka.zk.ZKEphemeralTest ==
[info] 
[info] == core-kafka / unit.kafka.producer.ProducerMethodsTest ==
[info] Test Starting: producerThrowsNoBrokersException
[info] Test Passed: producerThrowsNoBrokersException
[info] == core-kafka / unit.kafka.producer.ProducerMethodsTest ==
[info] 
[info] == core-kafka / kafka.message.MessageTest ==
[info] Test Starting: testFieldValues
[info] Test Passed: testFieldValues
[info] Test Starting: testChecksum
[info] Test Passed: testChecksum
[info] Test Starting: testEquality
[info] Test Passed: testEquality
[info] Test Starting: testIsHashable
[info] Test Passed: testIsHashable
[info] == core-kafka / kafka.message.MessageTest ==
[info] 
[info] == core-kafka / kafka.zk.ZKLoadBalanceTest ==
[info] Test Starting: testLoadBalance(kafka.zk.ZKLoadBalanceTest)
[info] Test Passed: testLoadBalance(kafka.zk.ZKLoadBalanceTest)
[info] == core-kafka / kafka.zk.ZKLoadBalanceTest ==
[info] 
[info] == core-kafka / kafka.log.LogManagerTest ==
[info] Test Starting: testCreateLog
[info] Test Passed: testCreateLog
[info] Test Starting: testGetLog
[info] Test Passed: testGetLog
[info] Test Starting: testInvalidTopicName
[info] Test Passed: testInvalidTopicName
[info] Test Starting: testCleanupExpiredSegments
[info] Test Passed: testCleanupExpiredSegments
[info] Test Starting: testCleanupSegmentsToMaintainSize
[info] Test Passed: testCleanupSegmentsToMaintainSize
[info] Test Starting: testTimeBasedFlush
[info] Test Passed: testTimeBasedFlush
[info] Test Starting: testConfigurablePartitions
[info] Test Passed: testConfigurablePartitions
[info] == core-kafka / kafka.log.LogManagerTest ==
[info] 
[info] == core-kafka / kafka.producer.SyncProducerTest ==
[info] Test Starting: testReachableServer
[info] Test Passed: testReachableServer
[info] Test Starting: testSingleMessageSizeTooLarge
[info] Test Passed: testSingleMessageSizeTooLarge
[info] Test Starting: testCompressedMessageSizeTooLarge
[info] Test Passed: testCompressedMessageSizeTooLarge
[info] == core-kafka / kafka.producer.SyncProducerTest ==
[info] 
[info] == core-kafka / kafka.consumer.FetcherTest ==
[info] Test Starting: testFetcher(kafka.consumer.FetcherTest)
[info] Test Passed: testFetcher(kafka.consumer.FetcherTest)
[info] == core-kafka / kafka.consumer.FetcherTest ==
[info] 
[info] == core-kafka / kafka.log.SegmentListTest ==
[info] Test Starting: testAppend
[info] Test Passed: testAppend
[info] Test Starting: testTrunc
[info] Test Passed: testTrunc
[info] Test Starting: testTruncBeyondList
[info] Test Passed: testTruncBeyondList
[info] == core-kafka / kafka.log.SegmentListTest ==
[info] 
[info] == core-kafka / kafka.utils.UtilsTest ==
[info] Test Starting: testSwallow
[info] Test Passed: testSwallow
[info] Test Starting: testCircularIterator
[info] Test Passed: testCircularIterator
[info] == core-kafka / kafka.utils.UtilsTest ==
[info] 
[info] == core-kafka / kafka.javaapi.message.ByteBufferMessageSetTest ==
[info] Test Starting: testWrittenEqualsRead
[info] Test Passed: testWrittenEqualsRead
[info] Test Starting: testIteratorIsConsistent
[info] Test Passed: testIteratorIsConsistent
[info] Test Starting: testSizeInBytes
[info] Test Passed: testSizeInBytes
[info] Test Starting: testValidBytes
[info] Test Passed: testValidBytes
[info] Test Starting: testEquals
[info] Test Passed: testEquals
[info] Test Starting: testIteratorIsConsistentWithCompression
[info] Test Passed: testIteratorIsConsistentWithCompression
[info] Test Starting: testSizeInBytesWithCompression
[info] Test Passed: testSizeInBytesWithCompression
[info] Test Starting: testValidBytesWithCompression
[info] Test Passed: testValidBytesWithCompression
[info] Test Starting: testEqualsWithCompression
[info] Test Passed: testEqualsWithCompression
[info] == core-kafka / kafka.javaapi.message.ByteBufferMessageSetTest ==
[info] 
[info] == core-kafka / kafka.network.SocketServerTest ==
[info] Test Starting: simpleRequest
[info] Test Passed: simpleRequest
[info] Test Starting: tooBigRequestIsRejected
[info] Test Passed: tooBigRequestIsRejected
[info] == core-kafka / kafka.network.SocketServerTest ==
[info] 
[info] == core-kafka / test-finish ==
[error] Failed: : Total 141, Failed 1, Errors 0, Passed 140, Skipped 0
[info] == core-kafka / test-finish ==
[info] 
[info] == core-kafka / Test cleanup 1 ==
[info] Deleting directory /tmp/sbt_a07426d7
[info] == core-kafka / Test cleanup 1 ==
[info] 
[info] == core-kafka / test-cleanup ==
[info] == core-kafka / test-cleanup ==
[info] 
[info] == perf / copy-test-resources ==
[info] == perf / copy-test-resources ==
[info] 
[info] == java-examples / test-compile ==
[info]   Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.
[info] Compiling test sources...
[info] Nothing to compile.
[info]   Post-analysis: 0 classes.
[info] == java-examples / test-compile ==
[info] 
[info] == perf / test-compile ==
[info]   Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.
[info] Compiling test sources...
[info] Nothing to compile.
[info]   Post-analysis: 0 classes.
[info] == perf / test-compile ==
[info] 
[info] == hadoop producer / copy-resources ==
[info] == hadoop producer / copy-resources ==
[info] 
[info] == hadoop producer / copy-test-resources ==
[info] == hadoop producer / copy-test-resources ==
[info] 
[info] == java-examples / copy-test-resources ==
[info] == java-examples / copy-test-resources ==
[info] 
[info] == hadoop consumer / copy-test-resources ==
[info] == hadoop consumer / copy-test-resources ==
[info] 
[info] == java-examples / copy-resources ==
[info] == java-examples / copy-resources ==
[info] 
[info] == hadoop producer / test-compile ==
[info]   Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.
[info] Compiling test sources...
[info] Nothing to compile.
[info]   Post-analysis: 0 classes.
[info] == hadoop producer / test-compile ==
[info] 
[info] == hadoop consumer / test-compile ==
[info]   Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.
[info] Compiling test sources...
[info] Nothing to compile.
[info]   Post-analysis: 0 classes.
[info] == hadoop consumer / test-compile ==
[info] 
[info] == hadoop consumer / copy-resources ==
[info] == hadoop consumer / copy-resources ==
[info] 
[info] == perf / copy-resources ==
[info] == perf / copy-resources ==
[error] Error running kafka.integration.AutoOffsetResetTest: Test FAILED
[error] Error running test: One or more subtasks failed
[info] 
[info] Total time: 142 s, completed Sep 27, 2012 1:40:33 AM
[info] 
[info] Total session time: 142 s, completed Sep 27, 2012 1:40:33 AM
[error] Error during build.
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: Kafka-trunk #144

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Kafka-trunk/144/changes>

Changes:

[junrao] TopicCount.constructTopicCount isn't thread-safe; patched by Jun Rao; reviewed by Joel Koshy; KAFKA-379
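KAFKA-379 names a thread-safety bug in a static factory method. The general class of bug, as an illustration only (names hypothetical, not the actual `TopicCount` code): a factory that does unsynchronized check-then-act on shared mutable state races under concurrent callers, and one minimal fix is to serialize access:

```java
import java.util.HashMap;
import java.util.Map;

// Toy illustration of the bug class KAFKA-379 names: a static factory
// touching shared state. Without `synchronized`, concurrent callers
// race on the HashMap; marking the method synchronized is a minimal fix.
class TopicCountFactory {
    private static final Map<String, Integer> cache = new HashMap<>();

    // Returns how many times this topic has been constructed so far.
    static synchronized int constructTopicCount(String topic) {
        return cache.merge(topic, 1, Integer::sum);
    }
}
```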

------------------------------------------
[...truncated 420 lines...]
[info] 
[info] == core-kafka / kafka.message.FileMessageSetTest ==
[info] Test Starting: testWrittenEqualsRead
[info] Test Passed: testWrittenEqualsRead
[info] Test Starting: testIteratorIsConsistent
[info] Test Passed: testIteratorIsConsistent
[info] Test Starting: testSizeInBytes
[info] Test Passed: testSizeInBytes
[info] Test Starting: testWriteTo
[info] Test Passed: testWriteTo
[info] Test Starting: testFileSize
[info] Test Passed: testFileSize
[info] Test Starting: testIterationOverPartialAndTruncation
[info] Test Passed: testIterationOverPartialAndTruncation
[info] Test Starting: testIterationDoesntChangePosition
[info] Test Passed: testIterationDoesntChangePosition
[info] Test Starting: testRead
[info] Test Passed: testRead
[info] == core-kafka / kafka.message.FileMessageSetTest ==
[info] 
[info] == core-kafka / kafka.message.CompressionUtilTest ==
[info] Test Starting: testSimpleCompressDecompress
[info] Test Passed: testSimpleCompressDecompress
[info] Test Starting: testComplexCompressDecompress
[info] Test Passed: testComplexCompressDecompress
[info] Test Starting: testSnappyCompressDecompressExplicit
[info] Test Passed: testSnappyCompressDecompressExplicit
[info] == core-kafka / kafka.message.CompressionUtilTest ==
[info] 
[info] == core-kafka / kafka.log.LogTest ==
[info] Test Starting: testTimeBasedLogRoll
[info] Test Passed: testTimeBasedLogRoll
[info] Test Starting: testSizeBasedLogRoll
[info] Test Passed: testSizeBasedLogRoll
[info] Test Starting: testLoadEmptyLog
[info] Test Passed: testLoadEmptyLog
[info] Test Starting: testLoadInvalidLogsFails
[info] Test Passed: testLoadInvalidLogsFails
[info] Test Starting: testAppendAndRead
[info] Test Passed: testAppendAndRead
[info] Test Starting: testReadOutOfRange
[info] Test Passed: testReadOutOfRange
[info] Test Starting: testLogRolls
[info] Test Passed: testLogRolls
[info] Test Starting: testFindSegment
[info] Test Passed: testFindSegment
[info] Test Starting: testEdgeLogRolls
[info] Test Passed: testEdgeLogRolls
[info] Test Starting: testMessageSizeCheck
[info] Test Passed: testMessageSizeCheck
[info] == core-kafka / kafka.log.LogTest ==
[info] 
[info] == core-kafka / kafka.consumer.TopicFilterTest ==
[info] Test Starting: testWhitelists
[info] Test Passed: testWhitelists
[info] Test Starting: testBlacklists
[info] Test Passed: testBlacklists
[info] == core-kafka / kafka.consumer.TopicFilterTest ==
[info] 
[info] == core-kafka / kafka.javaapi.integration.PrimitiveApiTest ==
[info] Test Starting: testProduceAndMultiFetch(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndMultiFetch(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndMultiFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduce(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testMultiProduce(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduceWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testMultiProduceWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndFetch(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndFetch(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchJava(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndMultiFetchJava(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchJavaWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndMultiFetchJavaWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] == core-kafka / kafka.javaapi.integration.PrimitiveApiTest ==
[info] 
[info] == core-kafka / kafka.javaapi.consumer.ZookeeperConsumerConnectorTest ==
[info] Test Starting: testBasic(kafka.javaapi.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testBasic(kafka.javaapi.consumer.ZookeeperConsumerConnectorTest)
[info] == core-kafka / kafka.javaapi.consumer.ZookeeperConsumerConnectorTest ==
[info] 
[info] == core-kafka / kafka.message.ByteBufferMessageSetTest ==
[info] Test Starting: testWrittenEqualsRead
[info] Test Passed: testWrittenEqualsRead
[info] Test Starting: testIteratorIsConsistent
[info] Test Passed: testIteratorIsConsistent
[info] Test Starting: testSizeInBytes
[info] Test Passed: testSizeInBytes
[info] Test Starting: testWriteTo
[info] Test Passed: testWriteTo
[info] Test Starting: testSmallFetchSize
[info] Test Passed: testSmallFetchSize
[info] Test Starting: testValidBytes
[info] Test Passed: testValidBytes
[info] Test Starting: testEquals
[info] Test Passed: testEquals
[info] Test Starting: testIterator
[info] Test Passed: testIterator
[info] == core-kafka / kafka.message.ByteBufferMessageSetTest ==
[info] 
[info] == core-kafka / kafka.integration.AutoOffsetResetTest ==
[info] Test Starting: testEarliestOffsetResetForward(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testEarliestOffsetResetForward(kafka.integration.AutoOffsetResetTest)
[info] Test Starting: testEarliestOffsetResetBackward(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testEarliestOffsetResetBackward(kafka.integration.AutoOffsetResetTest)
[info] Test Starting: testLatestOffsetResetForward(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testLatestOffsetResetForward(kafka.integration.AutoOffsetResetTest)
[info] == core-kafka / kafka.integration.AutoOffsetResetTest ==
[info] 
[info] == core-kafka / kafka.javaapi.message.ByteBufferMessageSetTest ==
[info] Test Starting: testWrittenEqualsRead
[info] Test Passed: testWrittenEqualsRead
[info] Test Starting: testIteratorIsConsistent
[info] Test Passed: testIteratorIsConsistent
[info] Test Starting: testSizeInBytes
[info] Test Passed: testSizeInBytes
[info] Test Starting: testValidBytes
[info] Test Passed: testValidBytes
[info] Test Starting: testEquals
[info] Test Passed: testEquals
[info] Test Starting: testIteratorIsConsistentWithCompression
[info] Test Passed: testIteratorIsConsistentWithCompression
[info] Test Starting: testSizeInBytesWithCompression
[info] Test Passed: testSizeInBytesWithCompression
[info] Test Starting: testValidBytesWithCompression
[info] Test Passed: testValidBytesWithCompression
[info] Test Starting: testEqualsWithCompression
[info] Test Passed: testEqualsWithCompression
[info] == core-kafka / kafka.javaapi.message.ByteBufferMessageSetTest ==
[info] 
[info] == core-kafka / kafka.server.ServerShutdownTest ==
[info] Test Starting: testCleanShutdown
[info] Test Passed: testCleanShutdown
[info] == core-kafka / kafka.server.ServerShutdownTest ==
[info] 
[info] == core-kafka / kafka.producer.SyncProducerTest ==
[info] Test Starting: testReachableServer
[info] Test Passed: testReachableServer
[info] Test Starting: testSingleMessageSizeTooLarge
[info] Test Passed: testSingleMessageSizeTooLarge
[info] Test Starting: testCompressedMessageSizeTooLarge
[info] Test Passed: testCompressedMessageSizeTooLarge
[info] == core-kafka / kafka.producer.SyncProducerTest ==
[info] 
[info] == core-kafka / kafka.log.LogOffsetTest ==
[info] Test Starting: testEmptyLogs
[info] Test Passed: testEmptyLogs
[info] Test Starting: testGetOffsetsBeforeLatestTime
[info] Test Passed: testGetOffsetsBeforeLatestTime
[info] Test Starting: testEmptyLogsGetOffsets
[info] Test Passed: testEmptyLogsGetOffsets
[info] Test Starting: testGetOffsetsBeforeNow
[info] Test Passed: testGetOffsetsBeforeNow
[info] Test Starting: testGetOffsetsBeforeEarliestTime
[info] Test Passed: testGetOffsetsBeforeEarliestTime
[info] == core-kafka / kafka.log.LogOffsetTest ==
[info] 
[info] == core-kafka / kafka.zk.ZKEphemeralTest ==
[info] Test Starting: testEphemeralNodeCleanup(kafka.zk.ZKEphemeralTest)
[info] Test Passed: testEphemeralNodeCleanup(kafka.zk.ZKEphemeralTest)
[info] == core-kafka / kafka.zk.ZKEphemeralTest ==
[info] 
[info] == core-kafka / kafka.integration.BackwardsCompatibilityTest ==
[info] Test Starting: testProtocolVersion0(kafka.integration.BackwardsCompatibilityTest)
[info] Test Passed: testProtocolVersion0(kafka.integration.BackwardsCompatibilityTest)
[info] == core-kafka / kafka.integration.BackwardsCompatibilityTest ==
[info] 
[info] == core-kafka / kafka.message.MessageTest ==
[info] Test Starting: testFieldValues
[info] Test Passed: testFieldValues
[info] Test Starting: testChecksum
[info] Test Passed: testChecksum
[info] Test Starting: testEquality
[info] Test Passed: testEquality
[info] Test Starting: testIsHashable
[info] Test Passed: testIsHashable
[info] == core-kafka / kafka.message.MessageTest ==
[info] 
[info] == core-kafka / kafka.javaapi.producer.ProducerTest ==
[info] Test Starting: testSend
[info] Test Passed: testSend
[info] Test Starting: testSendSingleMessage
[info] Test Passed: testSendSingleMessage
[info] Test Starting: testInvalidPartition
[info] Test Passed: testInvalidPartition
[info] Test Starting: testSyncProducerPool
[info] Test Passed: testSyncProducerPool
[info] Test Starting: testAsyncProducerPool
[info] Test Passed: testAsyncProducerPool
[info] Test Starting: testSyncUnavailableProducerException
[info] Test Passed: testSyncUnavailableProducerException
[info] Test Starting: testAsyncUnavailableProducerException
[info] Test Passed: testAsyncUnavailableProducerException
[info] Test Starting: testConfigBrokerPartitionInfoWithPartitioner
[info] Test Passed: testConfigBrokerPartitionInfoWithPartitioner
[info] Test Starting: testConfigBrokerPartitionInfo
[info] Test Passed: testConfigBrokerPartitionInfo
[info] Test Starting: testZKSendToNewTopic
[info] Test Passed: testZKSendToNewTopic
[info] Test Starting: testZKSendWithDeadBroker
[info] Test Passed: testZKSendWithDeadBroker
[info] Test Starting: testPartitionedSendToNewTopic
[info] Test Passed: testPartitionedSendToNewTopic
[info] Test Starting: testPartitionedSendToNewBrokerInExistingTopic
[info] Test Passed: testPartitionedSendToNewBrokerInExistingTopic
[info] Test Starting: testDefaultPartitioner
[info] Test Passed: testDefaultPartitioner
[info] == core-kafka / kafka.javaapi.producer.ProducerTest ==
[info] 
[info] == core-kafka / kafka.consumer.FetcherTest ==
[info] Test Starting: testFetcher(kafka.consumer.FetcherTest)
[info] Test Passed: testFetcher(kafka.consumer.FetcherTest)
[info] == core-kafka / kafka.consumer.FetcherTest ==
[info] 
[info] == core-kafka / Test cleanup 1 ==
[info] Deleting directory /tmp/sbt_aa7c472d
[info] == core-kafka / Test cleanup 1 ==
[info] 
[info] == core-kafka / test-finish ==
[error] Failed: : Total 141, Failed 1, Errors 0, Passed 140, Skipped 0
[info] == core-kafka / test-finish ==
[info] 
[info] == core-kafka / test-cleanup ==
[info] == core-kafka / test-cleanup ==
[info] 
[info] == perf / test-compile ==
[info]   Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.
[info] Compiling test sources...
[info] Nothing to compile.
[info]   Post-analysis: 0 classes.
[info] == perf / test-compile ==
[info] 
[info] == hadoop consumer / copy-test-resources ==
[info] == hadoop consumer / copy-test-resources ==
[info] 
[info] == hadoop consumer / copy-resources ==
[info] == hadoop consumer / copy-resources ==
[error] Error running kafka.producer.ProducerTest: Test FAILED
[error] Error running compile: javac returned nonzero exit code
[error] Error running compile: javac returned nonzero exit code
[error] Error running test: One or more subtasks failed
[info] 
[info] Total time: 137 s, completed Sep 18, 2012 6:04:25 AM
[info] 
[info] Total session time: 137 s, completed Sep 18, 2012 6:04:25 AM
[error] Error during build.
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: Kafka-trunk #143

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Kafka-trunk/143/changes>

Changes:

[junrao] Handle topic names with / on Kafka server; patched by Swapnil Ghike; reviewed by Jay Kreps and Jun Rao; kafka-495
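The change above concerns rejecting topic names containing `/`, which matters because each topic maps to an on-disk log directory. A minimal sketch of that kind of server-side validation (the character class and function name here are illustrative assumptions, not Kafka's actual rule set):

```python
import re

# Hypothetical allowed alphabet for topic names: reject anything with '/'
# (or other special characters) so a name cannot escape the log directory layout.
_LEGAL_TOPIC = re.compile(r"^[A-Za-z0-9._-]+$")

def validate_topic(name: str) -> str:
    """Return the name unchanged if legal, else raise ValueError."""
    if not name or not _LEGAL_TOPIC.match(name):
        raise ValueError(f"illegal topic name: {name!r}")
    return name
```

Validating at the server, rather than only in clients, closes the hole where a misbehaving producer creates a topic like `a/b` that the broker then tries to use as a nested path.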

------------------------------------------
[...truncated 2261 lines...]
[2012-09-07 04:19:08,269] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:08,270] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testProduceAndMultiFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduce(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-07 04:19:08,271] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-09-07 04:19:08,271] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-09-07 04:19:08,273] INFO Awaiting connections on port 9999 (kafka.network.Acceptor:130)
[2012-09-07 04:19:08,274] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-09-07 04:19:08,274] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-09-07 04:19:08,274] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-09-07 04:19:08,276] INFO Connected to localhost:9999 for producing (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:08,277] INFO Created log for 'test3'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:08,280] INFO Created log for 'test2'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:08,283] INFO Created log for 'test1'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:08,418] WARN Session 0x1399ef3d5c10005 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-07 04:19:08,479] INFO Disconnecting from localhost:9999 (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:08,479] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-09-07 04:19:08,479] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:08,479] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:08,480] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:08,481] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:08,481] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testMultiProduce(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduceWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-07 04:19:08,482] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-09-07 04:19:08,483] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-09-07 04:19:08,484] INFO Awaiting connections on port 9999 (kafka.network.Acceptor:130)
[2012-09-07 04:19:08,485] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-09-07 04:19:08,485] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-09-07 04:19:08,486] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-09-07 04:19:08,487] INFO Connected to localhost:9999 for producing (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:08,488] INFO Created log for 'test3'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:08,489] WARN Session 0x1399ef3ded30002 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-07 04:19:08,492] INFO Created log for 'test2'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:08,497] INFO Created log for 'test1'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:08,639] WARN Session 0x1399ef3d5c10003 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-07 04:19:08,691] INFO Disconnecting from localhost:9999 (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:08,691] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-09-07 04:19:08,691] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:08,691] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:08,691] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:08,692] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:08,693] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testMultiProduceWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndFetch(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-07 04:19:08,694] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-09-07 04:19:08,694] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-09-07 04:19:08,696] INFO Awaiting connections on port 9999 (kafka.network.Acceptor:130)
[2012-09-07 04:19:08,697] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-09-07 04:19:08,697] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-09-07 04:19:08,697] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-09-07 04:19:08,698] INFO Connected to localhost:9999 for producing (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:08,699] INFO Created log for 'test'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:08,723] WARN Session 0x1399ef3aa650007 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-07 04:19:09,100] INFO Disconnecting from localhost:9999 (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:09,100] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,100] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:09,100] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:09,100] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:09,102] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:09,103] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testProduceAndFetch(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-07 04:19:09,104] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,104] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-09-07 04:19:09,106] INFO Awaiting connections on port 9999 (kafka.network.Acceptor:130)
[2012-09-07 04:19:09,106] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-09-07 04:19:09,107] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-09-07 04:19:09,107] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,107] INFO Connected to localhost:9999 for producing (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:09,108] INFO Created log for 'test'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:09,253] WARN Session 0x1399ef3d5c10002 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-07 04:19:09,509] INFO Disconnecting from localhost:9999 (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:09,510] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,510] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:09,510] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:09,510] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:09,511] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:09,512] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testProduceAndFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchJava(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-07 04:19:09,512] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,513] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-09-07 04:19:09,514] INFO Awaiting connections on port 9999 (kafka.network.Acceptor:130)
[2012-09-07 04:19:09,515] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-09-07 04:19:09,515] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-09-07 04:19:09,516] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,517] INFO Connected to localhost:9999 for producing (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:09,518] INFO Created log for 'test1'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:09,547] INFO Created log for 'test2'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:09,551] INFO Created log for 'test3'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:09,641] WARN Session 0x1399ef3d5c10004 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-07 04:19:09,720] INFO Disconnecting from localhost:9999 (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:09,720] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,720] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:09,720] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:09,720] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:09,722] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:09,722] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testProduceAndMultiFetchJava(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchJavaWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-07 04:19:09,724] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,724] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-09-07 04:19:09,726] INFO Awaiting connections on port 9999 (kafka.network.Acceptor:130)
[2012-09-07 04:19:09,727] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-09-07 04:19:09,727] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-09-07 04:19:09,728] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,729] INFO Connected to localhost:9999 for producing (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:09,730] INFO Created log for 'test1'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:09,767] INFO Created log for 'test2'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:09,796] WARN Session 0x1399ef3d5c10005 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-07 04:19:09,818] INFO Created log for 'test3'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:09,884] WARN Session 0x1399ef3d5c10003 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
	at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-07 04:19:09,933] INFO Disconnecting from localhost:9999 (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:09,933] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,934] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:09,933] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:09,934] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:09,935] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:09,935] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testProduceAndMultiFetchJavaWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] == core-kafka / kafka.javaapi.integration.PrimitiveApiTest ==
[info] 
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] Test Starting: testKafkaLog4jConfigs
[2012-09-07 04:19:09,946] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,946] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-09-07 04:19:09,947] INFO connecting to ZK: 127.0.0.1:2182 (kafka.server.KafkaZooKeeper:61)
[2012-09-07 04:19:09,958] INFO Awaiting connections on port 46964 (kafka.network.Acceptor:130)
[2012-09-07 04:19:09,959] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-09-07 04:19:09,959] INFO Registering broker /brokers/ids/0 (kafka.server.KafkaZooKeeper:61)
[2012-09-07 04:19:09,967] INFO Registering broker /brokers/ids/0 succeeded with id:0,creatorId:67.195.138.60-1346991549959,host:67.195.138.60,port:46964 (kafka.server.KafkaZooKeeper:61)
[2012-09-07 04:19:09,967] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-09-07 04:19:09,967] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,968] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,968] INFO starting log cleaner every 300000 ms (kafka.log.LogManager:61)
[2012-09-07 04:19:09,970] INFO Awaiting connections on port 40868 (kafka.network.Acceptor:130)
[2012-09-07 04:19:09,971] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-09-07 04:19:09,971] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-09-07 04:19:09,971] INFO Kafka server started. (kafka.server.KafkaServer:61)
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
log4j:WARN No appenders could be found for logger (kafka.producer.ProducerPool).
log4j:WARN Please initialize the log4j system properly.
[info] Test Passed: testKafkaLog4jConfigs
[info] Test Starting: testBrokerListLog4jAppends
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
[info] Test Passed: testBrokerListLog4jAppends
[info] Test Starting: testZkConnectLog4jAppends
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
[info] Test Passed: testZkConnectLog4jAppends
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] 
[info] == core-kafka / kafka.integration.LazyInitProducerTest ==
[info] Test Starting: testProduceAndMultiFetch(kafka.integration.LazyInitProducerTest)
[info] Test Passed: testProduceAndMultiFetch(kafka.integration.LazyInitProducerTest)
[info] Test Starting: testMultiProduce(kafka.integration.LazyInitProducerTest)
[info] Test Passed: testMultiProduce(kafka.integration.LazyInitProducerTest)
[info] Test Starting: testProduceAndFetch(kafka.integration.LazyInitProducerTest)
[info] Test Passed: testProduceAndFetch(kafka.integration.LazyInitProducerTest)
[info] Test Starting: testMultiProduceResend(kafka.integration.LazyInitProducerTest)
[info] Test Passed: testMultiProduceResend(kafka.integration.LazyInitProducerTest)
[info] == core-kafka / kafka.integration.LazyInitProducerTest ==
[info] 
[info] == core-kafka / kafka.utils.UtilsTest ==
[info] Test Starting: testSwallow
[info] Test Passed: testSwallow
[info] Test Starting: testCircularIterator
[info] Test Passed: testCircularIterator
[info] == core-kafka / kafka.utils.UtilsTest ==
[info] 
[info] == core-kafka / test-finish ==
[error] Failed: : Total 141, Failed 1, Errors 0, Passed 140, Skipped 0
[info] == core-kafka / test-finish ==
[info] 
[info] == core-kafka / Test cleanup 1 ==
[info] Deleting directory /tmp/sbt_fb4e25f5
[info] == core-kafka / Test cleanup 1 ==
[info] 
[info] == core-kafka / test-cleanup ==
[info] == core-kafka / test-cleanup ==
[info] 
[info] == hadoop consumer / copy-test-resources ==
[info] == hadoop consumer / copy-test-resources ==
[info] 
[info] == java-examples / copy-resources ==
[info] == java-examples / copy-resources ==
[info] 
[info] == java-examples / copy-test-resources ==
[info] == java-examples / copy-test-resources ==
[info] 
[info] == hadoop consumer / test-compile ==
[info]   Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.
[info] Compiling test sources...
[info] Nothing to compile.
[info]   Post-analysis: 0 classes.
[info] == hadoop consumer / test-compile ==
[info] 
[info] == hadoop producer / copy-resources ==
[info] == hadoop producer / copy-resources ==
[info] 
[info] == hadoop consumer / copy-resources ==
[info] == hadoop consumer / copy-resources ==
[info] 
[info] == java-examples / test-compile ==
[info]   Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.
[info] Compiling test sources...
[info] Nothing to compile.
[info]   Post-analysis: 0 classes.
[info] == java-examples / test-compile ==
[error] Error running kafka.integration.AutoOffsetResetTest: Test FAILED
[error] Error running test: One or more subtasks failed
[info] 
[info] Total time: 121 s, completed Sep 7, 2012 4:19:17 AM
[info] 
[info] Total session time: 121 s, completed Sep 7, 2012 4:19:17 AM
[error] Error during build.
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: Kafka-trunk #142

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Kafka-trunk/142/changes>

Changes:

[junrao] Message size not checked at the server (patch v3); patched by Swapnil Ghike; reviewed by Jun Rao; KAFKA-469
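The change above adds a server-side message-size check, which the `testSingleMessageSizeTooLarge` / `testCompressedMessageSizeTooLarge` cases earlier in this digest exercise. A hedged sketch of the idea (the limit, names, and exception type are illustrative, not Kafka's actual configuration or API):

```python
# Hypothetical default cap on a single message's payload, in bytes.
MAX_MESSAGE_SIZE = 1_000_000

class MessageSizeTooLargeError(Exception):
    pass

def check_message_size(payload: bytes, max_size: int = MAX_MESSAGE_SIZE) -> None:
    # Reject oversized messages at append time, before they reach the log,
    # so a misconfigured producer cannot write segments that fetch requests
    # (bounded by their own fetch size) could never return to consumers.
    if len(payload) > max_size:
        raise MessageSizeTooLargeError(
            f"payload of {len(payload)} bytes exceeds limit of {max_size}")
```

Enforcing the limit on the broker, not just in producers, keeps the invariant even when clients disagree about configuration.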

------------------------------------------
[...truncated 2359 lines...]
[2012-08-31 05:21:57,528] INFO Registering broker /brokers/ids/0 succeeded with id:0,creatorId:localhost-1346408517439,host:localhost,port:52176 (kafka.server.KafkaZooKeeper:61)
[2012-08-31 05:21:57,532] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-08-31 05:21:57,536] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-08-31 05:21:57,542] INFO Connected to localhost:52176 for producing (kafka.producer.SyncProducer:61)
[2012-08-31 05:21:57,549] INFO Created log for 'test'-0 (kafka.log.LogManager:61)
[2012-08-31 05:21:57,551] INFO Begin registering broker topic /brokers/topics/test/0 with 1 partitions (kafka.server.KafkaZooKeeper:61)
[2012-08-31 05:21:57,644] INFO End registering broker topic /brokers/topics/test/0 (kafka.server.KafkaZooKeeper:61)
This is good
[2012-08-31 05:21:58,306] INFO group1_consumer1 Connecting to zookeeper instance at 127.0.0.1:2182 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,328] INFO group1_consumer1 starting auto committer every 1000 ms (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,333] INFO group1_consumer1 begin registering consumer group1_consumer1 in ZK (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,394] INFO group1_consumer1 end registering consumer group1_consumer1 in ZK (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,400] INFO group1_consumer1 starting watcher executor thread for consumer group1_consumer1 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,406] INFO group1_consumer1 begin rebalancing consumer group1_consumer1 try #0 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,457] INFO Cleared all relevant queues for this fetcher (kafka.consumer.Fetcher:61)
[2012-08-31 05:21:58,462] INFO Clearing the current data chunk for this consumer iterator (kafka.consumer.ConsumerIterator:61)
[2012-08-31 05:21:58,468] INFO Cleared the data chunks in all the consumer message iterators (kafka.consumer.Fetcher:61)
[2012-08-31 05:21:58,472] INFO group1_consumer1 Committing all offsets after clearing the fetcher queues (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,476] INFO group1_consumer1 Releasing partition ownership (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,481] INFO group1_consumer1 Consumer group1_consumer1 rebalancing the following partitions: List(0-0) for topic test with consumers: List(group1_consumer1-0) (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,485] INFO group1_consumer1 group1_consumer1-0 attempting to claim partition 0-0 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,497] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-08-31 05:21:58,552] INFO group1_consumer1 group1_consumer1-0 successfully owned partition 0-0 for topic test (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,556] INFO group1_consumer1 Updating the cache (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,566] INFO group1_consumer1 Consumer group1_consumer1 selected partitions : test:0-0: fetched offset = 0: consumed offset = 0 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,571] INFO group1_consumer1 end rebalancing consumer group1_consumer1 try #0 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,579] INFO group1_consumer1 ZKConsumerConnector shutting down (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,582] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-08-31 05:21:58,583] INFO force shutdown scheduler Kafka-consumer-autocommit- (kafka.utils.KafkaScheduler:61)
[2012-08-31 05:21:58,636] INFO group1_consumer1 ZKConsumerConnector shut down completed (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,640] INFO Disconnecting from localhost:52176 (kafka.producer.SyncProducer:61)
[2012-08-31 05:21:58,644] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-08-31 05:21:58,646] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-08-31 05:21:58,648] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-08-31 05:21:58,651] INFO ZK expired; release old list of broker partitions for topics  (kafka.producer.ZKBrokerPartitionInfo:61)
[2012-08-31 05:21:58,650] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-08-31 05:21:58,658] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-08-31 05:21:58,691] INFO Closing zookeeper client... (kafka.server.KafkaZooKeeper:61)
[2012-08-31 05:21:58,692] INFO zkActor stopped (kafka.log.LogManager:61)
[2012-08-31 05:21:58,711] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testMessageSizeTooLarge(kafka.log.LogCorruptionTest)
[info] == core-kafka / kafka.log.LogCorruptionTest ==
[info] 
[info] == core-kafka / kafka.zk.ZKLoadBalanceTest ==
[info] Test Starting: testLoadBalance(kafka.zk.ZKLoadBalanceTest)
[2012-08-31 05:21:58,926] INFO group1_consumer1 Connecting to zookeeper instance at 127.0.0.1:2182 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,960] INFO group1_consumer1 starting auto committer every 1000 ms (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:58,965] INFO group1_consumer1 begin registering consumer group1_consumer1 in ZK (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:59,053] INFO group1_consumer1 end registering consumer group1_consumer1 in ZK (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:59,058] INFO group1_consumer1 starting watcher executor thread for consumer group1_consumer1 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:59,065] INFO group1_consumer1 begin rebalancing consumer group1_consumer1 try #0 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:59,113] INFO group1_consumer1 Releasing partition ownership (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:59,118] INFO group1_consumer1 Consumer group1_consumer1 rebalancing the following partitions: List(400-0) for topic topic1 with consumers: List(group1_consumer1-0) (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:59,122] INFO group1_consumer1 group1_consumer1-0 attempting to claim partition 400-0 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:21:59,402] INFO group1_consumer1 stopping watcher executor thread for consumer group1_consumer1 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,392] ERROR group1_consumer1 error in earliestOrLatestOffset()  (kafka.consumer.ZookeeperConsumerConnector:89)
java.net.ConnectException: Connection timed out
	at sun.nio.ch.Net.connect0(Native Method)
	at sun.nio.ch.Net.connect(Net.java:364)
	at sun.nio.ch.Net.connect(Net.java:356)
	at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:623)
	at kafka.consumer.SimpleConsumer.connect(SimpleConsumer.scala:49)
	at kafka.consumer.SimpleConsumer.getOrMakeConnection(SimpleConsumer.scala:186)
	at kafka.consumer.SimpleConsumer.getOffsetsBefore(SimpleConsumer.scala:149)
	at kafka.consumer.ZookeeperConsumerConnector.kafka$consumer$ZookeeperConsumerConnector$$earliestOrLatestOffset(ZookeeperConsumerConnector.scala:329)
	at kafka.consumer.ZookeeperConsumerConnector$ZKRebalancerListener.kafka$consumer$ZookeeperConsumerConnector$ZKRebalancerListener$$addPartitionTopicInfo(ZookeeperConsumerConnector.scala:637)
	at kafka.consumer.ZookeeperConsumerConnector$ZKRebalancerListener$$anonfun$kafka$consumer$ZookeeperConsumerConnector$ZKRebalancerListener$$rebalance$1$$anonfun$apply$11$$anonfun$apply$13.apply(ZookeeperConsumerConnector.scala:523)
	at kafka.consumer.ZookeeperConsumerConnector$ZKRebalancerListener$$anonfun$kafka$consumer$ZookeeperConsumerConnector$ZKRebalancerListener$$rebalance$1$$anonfun$apply$11$$anonfun$apply$13.apply(ZookeeperConsumerConnector.scala:520)
	at scala.collection.immutable.Range$ByOne$class.foreach(Range.scala:285)
	at scala.collection.immutable.Range$$anon$2.foreach(Range.scala:265)
	at kafka.consumer.ZookeeperConsumerConnector$ZKRebalancerListener$$anonfun$kafka$consumer$ZookeeperConsumerConnector$ZKRebalancerListener$$rebalance$1$$anonfun$apply$11.apply(ZookeeperConsumerConnector.scala:520)
	at kafka.consumer.ZookeeperConsumerConnector$ZKRebalancerListener$$anonfun$kafka$consumer$ZookeeperConsumerConnector$ZKRebalancerListener$$rebalance$1$$anonfun$apply$11.apply(ZookeeperConsumerConnector.scala:507)
	at scala.collection.mutable.HashSet.foreach(HashSet.scala:61)
	at kafka.consumer.ZookeeperConsumerConnector$ZKRebalancerListener$$anonfun$kafka$consumer$ZookeeperConsumerConnector$ZKRebalancerListener$$rebalance$1.apply(ZookeeperConsumerConnector.scala:507)
	at kafka.consumer.ZookeeperConsumerConnector$ZKRebalancerListener$$anonfun$kafka$consumer$ZookeeperConsumerConnector$ZKRebalancerListener$$rebalance$1.apply(ZookeeperConsumerConnector.scala:494)
	at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:80)
	at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:80)
	at scala.collection.Iterator$class.foreach(Iterator.scala:631)
	at scala.collection.mutable.HashTable$$anon$1.foreach(HashTable.scala:161)
	at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:194)
	at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
	at scala.collection.mutable.HashMap.foreach(HashMap.scala:80)
	at kafka.consumer.ZookeeperConsumerConnector$ZKRebalancerListener.kafka$consumer$ZookeeperConsumerConnector$ZKRebalancerListener$$rebalance(ZookeeperConsumerConnector.scala:494)
	at kafka.consumer.ZookeeperConsumerConnector$ZKRebalancerListener$$anonfun$syncedRebalance$1.apply$mcVI$sp(ZookeeperConsumerConnector.scala:449)
	at scala.collection.immutable.Range$ByOne$class.foreach$mVc$sp(Range.scala:282)
	at scala.collection.immutable.Range$$anon$2.foreach$mVc$sp(Range.scala:265)
	at kafka.consumer.ZookeeperConsumerConnector$ZKRebalancerListener.syncedRebalance(ZookeeperConsumerConnector.scala:444)
	at kafka.consumer.ZookeeperConsumerConnector.kafka$consumer$ZookeeperConsumerConnector$$reinitializeConsumer(ZookeeperConsumerConnector.scala:733)
	at kafka.consumer.ZookeeperConsumerConnector.consume(ZookeeperConsumerConnector.scala:207)
	at kafka.consumer.ZookeeperConsumerConnector.createMessageStreams(ZookeeperConsumerConnector.scala:137)
	at kafka.zk.ZKLoadBalanceTest.testLoadBalance(ZKLoadBalanceTest.scala:47)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:601)
	at junit.framework.TestCase.runTest(TestCase.java:164)
	at junit.framework.TestCase.runBare(TestCase.java:130)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:120)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
	at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
	at sbt.TestRunner.run(TestFramework.scala:53)
	at sbt.TestRunner.runTest$1(TestFramework.scala:67)
	at sbt.TestRunner.run(TestFramework.scala:76)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.NamedTestTask.run(TestFramework.scala:92)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
	at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
	at sbt.impl.RunTask.runTask(RunTask.scala:85)
	at sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Control$.trapUnit(Control.scala:19)
	at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
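The trace above bottoms out in SimpleConsumer.connect(), a blocking NIO socket connect to the broker. A minimal sketch of that failure mode, independent of Kafka: with nothing listening on the target port, the blocking connect throws java.net.ConnectException (locally this is typically "Connection refused", whereas the Jenkins run above hit "Connection timed out"). The port number here is hypothetical, chosen only because nothing is expected to listen on it.

```java
import java.net.ConnectException;
import java.net.InetSocketAddress;
import java.nio.channels.SocketChannel;

// Sketch of the blocking NIO connect performed inside
// SimpleConsumer.connect(); when no broker is listening on the
// address, SocketChannel.connect() throws java.net.ConnectException.
public class ConnectSketch {
    public static void main(String[] args) throws Exception {
        try (SocketChannel ch = SocketChannel.open()) {
            // hypothetical broker port; nothing is listening here
            ch.connect(new InetSocketAddress("127.0.0.1", 50199));
            System.out.println("connected");
        } catch (ConnectException e) {
            System.out.println("ConnectException: " + e.getMessage());
        }
    }
}
```

In the test run above the same exception surfaces one level up, in earliestOrLatestOffset(), because getOffsetsBefore() lazily opens the connection via getOrMakeConnection().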
[2012-08-31 05:23:02,462] INFO group1_consumer1 group1_consumer1-0 successfully owned partition 400-0 for topic topic1 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,467] INFO group1_consumer1 Updating the cache (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,472] INFO group1_consumer1 Consumer group1_consumer1 selected partitions : topic1:400-0: fetched offset = -1: consumed offset = -1 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,476] INFO group1_consumer1 end rebalancing consumer group1_consumer1 try #0 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,489] INFO group1_consumer2 Connecting to zookeeper instance at 127.0.0.1:2182 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,514] INFO group1_consumer2 starting auto committer every 1000 ms (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,518] INFO group1_consumer2 begin registering consumer group1_consumer2 in ZK (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,530] INFO group1_consumer2 end registering consumer group1_consumer2 in ZK (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,533] INFO group1_consumer1 begin rebalancing consumer group1_consumer1 try #0 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,536] INFO group1_consumer2 starting watcher executor thread for consumer group1_consumer2 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,542] INFO group1_consumer2 begin rebalancing consumer group1_consumer2 try #0 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,596] INFO group1_consumer1 Releasing partition ownership (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,604] INFO group1_consumer1 Consumer group1_consumer1 rebalancing the following partitions: List(400-0) for topic topic1 with consumers: List(group1_consumer1-0, group1_consumer2-0) (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,606] INFO group1_consumer2 Releasing partition ownership (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,607] INFO group1_consumer1 group1_consumer1-0 attempting to claim partition 400-0 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,611] INFO group1_consumer2 Consumer group1_consumer2 rebalancing the following partitions: List(400-0) for topic topic1 with consumers: List(group1_consumer1-0, group1_consumer2-0) (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,618] WARN group1_consumer2 No broker partitions consumed by consumer thread group1_consumer2-0 for topic topic1 (kafka.consumer.ZookeeperConsumerConnector:73)
[2012-08-31 05:23:02,623] INFO group1_consumer2 Updating the cache (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,627] INFO group1_consumer2 Consumer group1_consumer2 selected partitions :  (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,631] INFO group1_consumer2 end rebalancing consumer group1_consumer2 try #0 (kafka.consumer.ZookeeperConsumerConnector:61)
[2012-08-31 05:23:02,852] WARN Exception causing close of session 0x1397c33946d0006 due to java.nio.channels.CancelledKeyException (org.apache.zookeeper.server.NIOServerCnxn:623)
[2012-08-31 05:23:02,852] ERROR Unexpected Exception:  (org.apache.zookeeper.server.NIOServerCnxn:445)
java.nio.channels.CancelledKeyException
	at sun.nio.ch.SelectionKeyImpl.ensureValid(SelectionKeyImpl.java:73)
	at sun.nio.ch.SelectionKeyImpl.interestOps(SelectionKeyImpl.java:77)
	at org.apache.zookeeper.server.NIOServerCnxn.sendBuffer(NIOServerCnxn.java:418)
	at org.apache.zookeeper.server.NIOServerCnxn.sendResponse(NIOServerCnxn.java:1509)
	at org.apache.zookeeper.server.FinalRequestProcessor.processRequest(FinalRequestProcessor.java:367)
	at org.apache.zookeeper.server.SyncRequestProcessor.run(SyncRequestProcessor.java:135)
[error] Test Failed: testLoadBalance(kafka.zk.ZKLoadBalanceTest)
junit.framework.AssertionFailedError: expected:<1> but was:<0>
	at junit.framework.Assert.fail(Assert.java:47)
	at junit.framework.Assert.failNotEquals(Assert.java:277)
	at junit.framework.Assert.assertEquals(Assert.java:64)
	at junit.framework.Assert.assertEquals(Assert.java:195)
	at junit.framework.Assert.assertEquals(Assert.java:201)
	at kafka.zk.ZKLoadBalanceTest.checkSetEqual(ZKLoadBalanceTest.scala:121)
	at kafka.zk.ZKLoadBalanceTest.testLoadBalance(ZKLoadBalanceTest.scala:67)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:601)
	at junit.framework.TestCase.runTest(TestCase.java:164)
	at junit.framework.TestCase.runBare(TestCase.java:130)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:120)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at junit.framework.TestSuite.runTest(TestSuite.java:228)
	at junit.framework.TestSuite.run(TestSuite.java:223)
	at org.scalatest.junit.JUnit3Suite.run(JUnit3Suite.scala:309)
	at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:40)
	at sbt.TestRunner.run(TestFramework.scala:53)
	at sbt.TestRunner.runTest$1(TestFramework.scala:67)
	at sbt.TestRunner.run(TestFramework.scala:76)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11.runTest$2(TestFramework.scala:194)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.TestFramework$$anonfun$10$$anonfun$apply$11$$anonfun$apply$12.apply(TestFramework.scala:205)
	at sbt.NamedTestTask.run(TestFramework.scala:92)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.ScalaProject$$anonfun$sbt$ScalaProject$$toTask$1.apply(ScalaProject.scala:193)
	at sbt.TaskManager$Task.invoke(TaskManager.scala:62)
	at sbt.impl.RunTask.doRun$1(RunTask.scala:77)
	at sbt.impl.RunTask.runTask(RunTask.scala:85)
	at sbt.impl.RunTask.sbt$impl$RunTask$$runIfNotRoot(RunTask.scala:60)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.impl.RunTask$$anonfun$runTasksExceptRoot$2.apply(RunTask.scala:48)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Distributor$Run$Worker$$anonfun$2.apply(ParallelRunner.scala:131)
	at sbt.Control$.trapUnit(Control.scala:19)
	at sbt.Distributor$Run$Worker.run(ParallelRunner.scala:131)
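The failure above is a plain JUnit 3 equality assertion: checkSetEqual() expected one owned partition but found none. A minimal sketch reproducing the shape of that failure message without the JUnit dependency (the class and method names here are hypothetical, not Kafka's actual test code):

```java
// Sketch: the "expected:<1> but was:<0>" message format produced by
// JUnit 3's Assert.assertEquals when two values differ.
public class AssertShape {
    static void assertEquals(Object expected, Object actual) {
        if (!expected.equals(actual))
            throw new AssertionError(
                "expected:<" + expected + "> but was:<" + actual + ">");
    }

    public static void main(String[] args) {
        try {
            assertEquals(1, 0); // same mismatch as checkSetEqual above
        } catch (AssertionError e) {
            System.out.println(e.getMessage());
        }
    }
}
```

In the actual run, the mismatch follows from the rebalance log directly above: group1_consumer1 retained partition 400-0 while group1_consumer2 was assigned nothing, so the ownership set the test checked was empty.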
[info] == core-kafka / kafka.zk.ZKLoadBalanceTest ==
[info] 
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] Test Starting: testKafkaLog4jConfigs
[2012-08-31 05:23:02,913] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-08-31 05:23:02,918] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-08-31 05:23:02,922] INFO connecting to ZK: 127.0.0.1:2182 (kafka.server.KafkaZooKeeper:61)
[2012-08-31 05:23:03,024] INFO Awaiting connections on port 50863 (kafka.network.Acceptor:130)
[2012-08-31 05:23:03,028] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-08-31 05:23:03,031] INFO Registering broker /brokers/ids/0 (kafka.server.KafkaZooKeeper:61)
[2012-08-31 05:23:03,078] INFO Registering broker /brokers/ids/0 succeeded with id:0,creatorId:127.0.1.1-1346408583035,host:127.0.1.1,port:50863 (kafka.server.KafkaZooKeeper:61)
[2012-08-31 05:23:03,082] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-08-31 05:23:03,086] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-08-31 05:23:03,091] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-08-31 05:23:03,096] INFO starting log cleaner every 300000 ms (kafka.log.LogManager:61)
[2012-08-31 05:23:03,142] INFO Awaiting connections on port 33400 (kafka.network.Acceptor:130)
[2012-08-31 05:23:03,146] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-08-31 05:23:03,150] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-08-31 05:23:03,154] INFO Kafka server started. (kafka.server.KafkaServer:61)
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
log4j:WARN No appenders could be found for logger (kafka.producer.ProducerPool).
log4j:WARN Please initialize the log4j system properly.
[info] Test Passed: testKafkaLog4jConfigs
[info] Test Starting: testBrokerListLog4jAppends
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
[info] Test Passed: testBrokerListLog4jAppends
[info] Test Starting: testZkConnectLog4jAppends
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
[info] Test Passed: testZkConnectLog4jAppends
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] 
[info] == core-kafka / test-finish ==
[error] Failed: : Total 140, Failed 2, Errors 0, Passed 138, Skipped 0
[info] == core-kafka / test-finish ==
[info] 
[info] == core-kafka / Test cleanup 1 ==
[info] Deleting directory /tmp/sbt_3154007d
[info] == core-kafka / Test cleanup 1 ==
[info] 
[info] == core-kafka / test-cleanup ==
[info] == core-kafka / test-cleanup ==
[error] Error running kafka.message.CompressionUtilTest: Test FAILED
[error] Error running kafka.zk.ZKLoadBalanceTest: Test FAILED
[error] Error running test: One or more subtasks failed
[info] 
[info] Total time: 1034 s, completed Aug 31, 2012 5:23:08 AM
[info] 
[info] Total session time: 1036 s, completed Aug 31, 2012 5:23:08 AM
[error] Error during build.
Build step 'Execute shell' marked build as failure