Posted to commits@kafka.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2012/09/07 06:19:17 UTC
Build failed in Jenkins: Kafka-trunk #143
See <https://builds.apache.org/job/Kafka-trunk/143/changes>
Changes:
[junrao] Handle topic names with / on Kafka server; patched by Swapnil Ghike; reviewed by Jay Kreps and Jun Rao; kafka-495
------------------------------------------
[...truncated 2261 lines...]
[2012-09-07 04:19:08,269] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:08,270] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testProduceAndMultiFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduce(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-07 04:19:08,271] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-09-07 04:19:08,271] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-09-07 04:19:08,273] INFO Awaiting connections on port 9999 (kafka.network.Acceptor:130)
[2012-09-07 04:19:08,274] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-09-07 04:19:08,274] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-09-07 04:19:08,274] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-09-07 04:19:08,276] INFO Connected to localhost:9999 for producing (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:08,277] INFO Created log for 'test3'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:08,280] INFO Created log for 'test2'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:08,283] INFO Created log for 'test1'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:08,418] WARN Session 0x1399ef3d5c10005 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-07 04:19:08,479] INFO Disconnecting from localhost:9999 (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:08,479] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-09-07 04:19:08,479] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:08,479] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:08,480] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:08,481] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:08,481] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testMultiProduce(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduceWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-07 04:19:08,482] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-09-07 04:19:08,483] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-09-07 04:19:08,484] INFO Awaiting connections on port 9999 (kafka.network.Acceptor:130)
[2012-09-07 04:19:08,485] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-09-07 04:19:08,485] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-09-07 04:19:08,486] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-09-07 04:19:08,487] INFO Connected to localhost:9999 for producing (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:08,488] INFO Created log for 'test3'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:08,489] WARN Session 0x1399ef3ded30002 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-07 04:19:08,492] INFO Created log for 'test2'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:08,497] INFO Created log for 'test1'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:08,639] WARN Session 0x1399ef3d5c10003 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-07 04:19:08,691] INFO Disconnecting from localhost:9999 (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:08,691] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-09-07 04:19:08,691] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:08,691] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:08,691] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:08,692] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:08,693] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testMultiProduceWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndFetch(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-07 04:19:08,694] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-09-07 04:19:08,694] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-09-07 04:19:08,696] INFO Awaiting connections on port 9999 (kafka.network.Acceptor:130)
[2012-09-07 04:19:08,697] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-09-07 04:19:08,697] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-09-07 04:19:08,697] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-09-07 04:19:08,698] INFO Connected to localhost:9999 for producing (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:08,699] INFO Created log for 'test'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:08,723] WARN Session 0x1399ef3aa650007 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-07 04:19:09,100] INFO Disconnecting from localhost:9999 (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:09,100] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,100] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:09,100] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:09,100] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:09,102] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:09,103] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testProduceAndFetch(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-07 04:19:09,104] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,104] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-09-07 04:19:09,106] INFO Awaiting connections on port 9999 (kafka.network.Acceptor:130)
[2012-09-07 04:19:09,106] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-09-07 04:19:09,107] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-09-07 04:19:09,107] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,107] INFO Connected to localhost:9999 for producing (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:09,108] INFO Created log for 'test'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:09,253] WARN Session 0x1399ef3d5c10002 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-07 04:19:09,509] INFO Disconnecting from localhost:9999 (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:09,510] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,510] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:09,510] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:09,510] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:09,511] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:09,512] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testProduceAndFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchJava(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-07 04:19:09,512] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,513] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-09-07 04:19:09,514] INFO Awaiting connections on port 9999 (kafka.network.Acceptor:130)
[2012-09-07 04:19:09,515] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-09-07 04:19:09,515] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-09-07 04:19:09,516] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,517] INFO Connected to localhost:9999 for producing (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:09,518] INFO Created log for 'test1'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:09,547] INFO Created log for 'test2'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:09,551] INFO Created log for 'test3'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:09,641] WARN Session 0x1399ef3d5c10004 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-07 04:19:09,720] INFO Disconnecting from localhost:9999 (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:09,720] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,720] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:09,720] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:09,720] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:09,722] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:09,722] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testProduceAndMultiFetchJava(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchJavaWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-07 04:19:09,724] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,724] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-09-07 04:19:09,726] INFO Awaiting connections on port 9999 (kafka.network.Acceptor:130)
[2012-09-07 04:19:09,727] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-09-07 04:19:09,727] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-09-07 04:19:09,728] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,729] INFO Connected to localhost:9999 for producing (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:09,730] INFO Created log for 'test1'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:09,767] INFO Created log for 'test2'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:09,796] WARN Session 0x1399ef3d5c10005 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-07 04:19:09,818] INFO Created log for 'test3'-0 (kafka.log.LogManager:61)
[2012-09-07 04:19:09,884] WARN Session 0x1399ef3d5c10003 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-07 04:19:09,933] INFO Disconnecting from localhost:9999 (kafka.producer.SyncProducer:61)
[2012-09-07 04:19:09,933] INFO Shutting down Kafka server (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,934] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:09,933] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor:223)
[2012-09-07 04:19:09,934] INFO shutdown scheduler kafka-logcleaner- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:09,935] INFO shutdown scheduler kafka-logflusher- (kafka.utils.KafkaScheduler:61)
[2012-09-07 04:19:09,935] INFO Kafka server shut down completed (kafka.server.KafkaServer:61)
[info] Test Passed: testProduceAndMultiFetchJavaWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] == core-kafka / kafka.javaapi.integration.PrimitiveApiTest ==
[info]
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] Test Starting: testKafkaLog4jConfigs
[2012-09-07 04:19:09,946] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,946] INFO starting log cleaner every 600000 ms (kafka.log.LogManager:61)
[2012-09-07 04:19:09,947] INFO connecting to ZK: 127.0.0.1:2182 (kafka.server.KafkaZooKeeper:61)
[2012-09-07 04:19:09,958] INFO Awaiting connections on port 46964 (kafka.network.Acceptor:130)
[2012-09-07 04:19:09,959] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-09-07 04:19:09,959] INFO Registering broker /brokers/ids/0 (kafka.server.KafkaZooKeeper:61)
[2012-09-07 04:19:09,967] INFO Registering broker /brokers/ids/0 succeeded with id:0,creatorId:67.195.138.60-1346991549959,host:67.195.138.60,port:46964 (kafka.server.KafkaZooKeeper:61)
[2012-09-07 04:19:09,967] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-09-07 04:19:09,967] INFO Kafka server started. (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,968] INFO Starting Kafka server... (kafka.server.KafkaServer:61)
[2012-09-07 04:19:09,968] INFO starting log cleaner every 300000 ms (kafka.log.LogManager:61)
[2012-09-07 04:19:09,970] INFO Awaiting connections on port 40868 (kafka.network.Acceptor:130)
[2012-09-07 04:19:09,971] INFO Will not load MX4J, mx4j-tools.jar is not in the classpath (kafka.utils.Mx4jLoader$:61)
[2012-09-07 04:19:09,971] INFO Starting log flusher every 3000 ms with the following overrides Map() (kafka.log.LogManager:61)
[2012-09-07 04:19:09,971] INFO Kafka server started. (kafka.server.KafkaServer:61)
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
log4j:WARN No appenders could be found for logger (kafka.producer.ProducerPool).
log4j:WARN Please initialize the log4j system properly.
[info] Test Passed: testKafkaLog4jConfigs
[info] Test Starting: testBrokerListLog4jAppends
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
[info] Test Passed: testBrokerListLog4jAppends
[info] Test Starting: testZkConnectLog4jAppends
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
[info] Test Passed: testZkConnectLog4jAppends
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info]
[info] == core-kafka / kafka.integration.LazyInitProducerTest ==
[info] Test Starting: testProduceAndMultiFetch(kafka.integration.LazyInitProducerTest)
[info] Test Passed: testProduceAndMultiFetch(kafka.integration.LazyInitProducerTest)
[info] Test Starting: testMultiProduce(kafka.integration.LazyInitProducerTest)
[info] Test Passed: testMultiProduce(kafka.integration.LazyInitProducerTest)
[info] Test Starting: testProduceAndFetch(kafka.integration.LazyInitProducerTest)
[info] Test Passed: testProduceAndFetch(kafka.integration.LazyInitProducerTest)
[info] Test Starting: testMultiProduceResend(kafka.integration.LazyInitProducerTest)
[info] Test Passed: testMultiProduceResend(kafka.integration.LazyInitProducerTest)
[info] == core-kafka / kafka.integration.LazyInitProducerTest ==
[info]
[info] == core-kafka / kafka.utils.UtilsTest ==
[info] Test Starting: testSwallow
[info] Test Passed: testSwallow
[info] Test Starting: testCircularIterator
[info] Test Passed: testCircularIterator
[info] == core-kafka / kafka.utils.UtilsTest ==
[info]
[info] == core-kafka / test-finish ==
[error] Failed: : Total 141, Failed 1, Errors 0, Passed 140, Skipped 0
[info] == core-kafka / test-finish ==
[info]
[info] == core-kafka / Test cleanup 1 ==
[info] Deleting directory /tmp/sbt_fb4e25f5
[info] == core-kafka / Test cleanup 1 ==
[info]
[info] == core-kafka / test-cleanup ==
[info] == core-kafka / test-cleanup ==
[info]
[info] == hadoop consumer / copy-test-resources ==
[info] == hadoop consumer / copy-test-resources ==
[info]
[info] == java-examples / copy-resources ==
[info] == java-examples / copy-resources ==
[info]
[info] == java-examples / copy-test-resources ==
[info] == java-examples / copy-test-resources ==
[info]
[info] == hadoop consumer / test-compile ==
[info]   Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.
[info] Compiling test sources...
[info] Nothing to compile.
[info]   Post-analysis: 0 classes.
[info] == hadoop consumer / test-compile ==
[info]
[info] == hadoop producer / copy-resources ==
[info] == hadoop producer / copy-resources ==
[info]
[info] == hadoop consumer / copy-resources ==
[info] == hadoop consumer / copy-resources ==
[info]
[info] == java-examples / test-compile ==
[info]   Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.
[info] Compiling test sources...
[info] Nothing to compile.
[info]   Post-analysis: 0 classes.
[info] == java-examples / test-compile ==
[error] Error running kafka.integration.AutoOffsetResetTest: Test FAILED
[error] Error running test: One or more subtasks failed
[info]
[info] Total time: 121 s, completed Sep 7, 2012 4:19:17 AM
[info]
[info] Total session time: 121 s, completed Sep 7, 2012 4:19:17 AM
[error] Error during build.
Build step 'Execute shell' marked build as failure

Jenkins build is back to normal : Kafka-trunk #146
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Kafka-trunk/146/changes>

Build failed in Jenkins: Kafka-trunk #145
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Kafka-trunk/145/changes>
Changes:
[joestein] KAFKA-533 changes to NOTICE and LICENSE related to KAFKA-534 removing client libraries from repo
[joestein] KAFKA-534 remove client library directory
------------------------------------------
[...truncated 1357 lines...]
[info] Test Starting: testRead
[info] Test Passed: testRead
[info] == core-kafka / kafka.message.FileMessageSetTest ==
[info]
[info] == core-kafka / kafka.javaapi.integration.PrimitiveApiTest ==
[info] Test Starting: testProduceAndMultiFetch(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-27 01:40:14,424] WARN Wrong partition -1 valid partitions (0,0) (kafka.log.LogManager:73)
[2012-09-27 01:40:14,424] WARN Wrong partition -1 valid partitions (0,0) (kafka.log.LogManager:73)
[2012-09-27 01:40:14,424] WARN Wrong partition -1 valid partitions (0,0) (kafka.log.LogManager:73)
[info] Test Passed: testProduceAndMultiFetch(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-27 01:40:14,634] WARN Wrong partition -1 valid partitions (0,0) (kafka.log.LogManager:73)
[2012-09-27 01:40:14,634] WARN Wrong partition -1 valid partitions (0,0) (kafka.log.LogManager:73)
[2012-09-27 01:40:14,634] WARN Wrong partition -1 valid partitions (0,0) (kafka.log.LogManager:73)
[info] Test Passed: testProduceAndMultiFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduce(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testMultiProduce(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduceWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-27 01:40:14,993] WARN Session 0x13a056187c60009 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[info] Test Passed: testMultiProduceWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndFetch(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-27 01:40:15,411] WARN Session 0x13a056179d9000a for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[info] Test Passed: testProduceAndFetch(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[2012-09-27 01:40:15,701] WARN Session 0x13a056199810004 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-27 01:40:15,719] WARN Session 0x13a056199810005 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[2012-09-27 01:40:15,734] WARN Session 0x13a056187c60006 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[info] Test Passed: testProduceAndFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchJava(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndMultiFetchJava(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchJavaWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndMultiFetchJavaWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] == core-kafka / kafka.javaapi.integration.PrimitiveApiTest ==
[info]
[info] == core-kafka / kafka.server.ServerShutdownTest ==
[info] Test Starting: testCleanShutdown
[2012-09-27 01:40:16,558] WARN Session 0x13a056187c60009 for server null, unexpected error, closing socket connection and attempting reconnect (org.apache.zookeeper.ClientCnxn:1188)
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1146)
[info] Test Passed: testCleanShutdown
[info] == core-kafka / kafka.server.ServerShutdownTest ==
[info]
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info] Test Starting: testKafkaLog4jConfigs
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
log4j:WARN No appenders could be found for logger (org.I0Itec.zkclient.ZkEventThread).
log4j:WARN Please initialize the log4j system properly.
[info] Test Passed: testKafkaLog4jConfigs
[info] Test Starting: testBrokerListLog4jAppends
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
[info] Test Passed: testBrokerListLog4jAppends
[info] Test Starting: testZkConnectLog4jAppends
log4j:WARN Using default encoder - kafka.serializer.StringEncoder
[info] Test Passed: testZkConnectLog4jAppends
[info] == core-kafka / kafka.log4j.KafkaLog4jAppenderTest ==
[info]
[info] == core-kafka / kafka.zk.ZKEphemeralTest ==
[info] Test Starting: testEphemeralNodeCleanup(kafka.zk.ZKEphemeralTest)
[info] Test Passed: testEphemeralNodeCleanup(kafka.zk.ZKEphemeralTest)
[info] == core-kafka / kafka.zk.ZKEphemeralTest ==
[info]
[info] == core-kafka / unit.kafka.producer.ProducerMethodsTest ==
[info] Test Starting: producerThrowsNoBrokersException
[info] Test Passed: producerThrowsNoBrokersException
[info] == core-kafka / unit.kafka.producer.ProducerMethodsTest ==
[info]
[info] == core-kafka / kafka.message.MessageTest ==
[info] Test Starting: testFieldValues
[info] Test Passed: testFieldValues
[info] Test Starting: testChecksum
[info] Test Passed: testChecksum
[info] Test Starting: testEquality
[info] Test Passed: testEquality
[info] Test Starting: testIsHashable
[info] Test Passed: testIsHashable
[info] == core-kafka / kafka.message.MessageTest ==
[info]
[info] == core-kafka / kafka.zk.ZKLoadBalanceTest ==
[info] Test Starting: testLoadBalance(kafka.zk.ZKLoadBalanceTest)
[info] Test Passed: testLoadBalance(kafka.zk.ZKLoadBalanceTest)
[info] == core-kafka / kafka.zk.ZKLoadBalanceTest ==
[info]
[info] == core-kafka / kafka.log.LogManagerTest ==
[info] Test Starting: testCreateLog
[info] Test Passed: testCreateLog
[info] Test Starting: testGetLog
[info] Test Passed: testGetLog
[info] Test Starting: testInvalidTopicName
[info] Test Passed: testInvalidTopicName
[info] Test Starting: testCleanupExpiredSegments
[info] Test Passed: testCleanupExpiredSegments
[info] Test Starting: testCleanupSegmentsToMaintainSize
[info] Test Passed: testCleanupSegmentsToMaintainSize
[info] Test Starting: testTimeBasedFlush
[info] Test Passed: testTimeBasedFlush
[info] Test Starting: testConfigurablePartitions
[info] Test Passed: testConfigurablePartitions
[info] == core-kafka / kafka.log.LogManagerTest ==
[info]
[info] == core-kafka / kafka.producer.SyncProducerTest ==
[info] Test Starting: testReachableServer
[info] Test Passed: testReachableServer
[info] Test Starting: testSingleMessageSizeTooLarge
[info] Test Passed: testSingleMessageSizeTooLarge
[info] Test Starting: testCompressedMessageSizeTooLarge
[info] Test Passed: testCompressedMessageSizeTooLarge
[info] == core-kafka / kafka.producer.SyncProducerTest ==
[info]
[info] == core-kafka / kafka.consumer.FetcherTest ==
[info] Test Starting: testFetcher(kafka.consumer.FetcherTest)
[info] Test Passed: testFetcher(kafka.consumer.FetcherTest)
[info] == core-kafka / kafka.consumer.FetcherTest ==
[info]
[info] == core-kafka / kafka.log.SegmentListTest ==
[info] Test Starting: testAppend
[info] Test Passed: testAppend
[info] Test Starting: testTrunc
[info] Test Passed: testTrunc
[info] Test Starting: testTruncBeyondList
[info] Test Passed: testTruncBeyondList
[info] == core-kafka / kafka.log.SegmentListTest ==
[info]
[info] == core-kafka / kafka.utils.UtilsTest ==
[info] Test Starting: testSwallow
[info] Test Passed: testSwallow
[info] Test Starting: testCircularIterator
[info] Test Passed: testCircularIterator
[info] == core-kafka / kafka.utils.UtilsTest ==
[info]
[info] == core-kafka / kafka.javaapi.message.ByteBufferMessageSetTest ==
[info] Test Starting: testWrittenEqualsRead
[info] Test Passed: testWrittenEqualsRead
[info] Test Starting: testIteratorIsConsistent
[info] Test Passed: testIteratorIsConsistent
[info] Test Starting: testSizeInBytes
[info] Test Passed: testSizeInBytes
[info] Test Starting: testValidBytes
[info] Test Passed: testValidBytes
[info] Test Starting: testEquals
[info] Test Passed: testEquals
[info] Test Starting: testIteratorIsConsistentWithCompression
[info] Test Passed: testIteratorIsConsistentWithCompression
[info] Test Starting: testSizeInBytesWithCompression
[info] Test Passed: testSizeInBytesWithCompression
[info] Test Starting: testValidBytesWithCompression
[info] Test Passed: testValidBytesWithCompression
[info] Test Starting: testEqualsWithCompression
[info] Test Passed: testEqualsWithCompression
[info] == core-kafka / kafka.javaapi.message.ByteBufferMessageSetTest ==
[0m[[0minfo[0m] [34m[0m
[0m[[0minfo[0m] [34m== core-kafka / kafka.network.SocketServerTest ==[0m
[0m[[0minfo[0m] [0mTest Starting: simpleRequest[0m
[0m[[0minfo[0m] [0mTest Passed: simpleRequest[0m
[0m[[0minfo[0m] [0mTest Starting: tooBigRequestIsRejected[0m
[0m[[0minfo[0m] [0mTest Passed: tooBigRequestIsRejected[0m
[0m[[0minfo[0m] [34m== core-kafka / kafka.network.SocketServerTest ==[0m
[0m[[0minfo[0m] [34m[0m
[0m[[0minfo[0m] [34m== core-kafka / test-finish ==[0m
[0m[[31merror[0m] [0mFailed: : Total 141, Failed 1, Errors 0, Passed 140, Skipped 0[0m
[0m[[0minfo[0m] [34m== core-kafka / test-finish ==[0m
[0m[[0minfo[0m] [34m[0m
[0m[[0minfo[0m] [34m== core-kafka / Test cleanup 1 ==[0m
[0m[[0minfo[0m] [0mDeleting directory /tmp/sbt_a07426d7[0m
[0m[[0minfo[0m] [34m== core-kafka / Test cleanup 1 ==[0m
[0m[[0minfo[0m] [34m[0m
[0m[[0minfo[0m] [34m== core-kafka / test-cleanup ==[0m
[0m[[0minfo[0m] [34m== core-kafka / test-cleanup ==[0m
[0m[[0minfo[0m] [34m[0m
[0m[[0minfo[0m] [34m== perf / copy-test-resources ==[0m
[0m[[0minfo[0m] [34m== perf / copy-test-resources ==[0m
[0m[[0minfo[0m] [34m[0m
[0m[[0minfo[0m] [34m== java-examples / test-compile ==[0m
[0m[[0minfo[0m] [0m Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.[0m
[0m[[0minfo[0m] [0mCompiling test sources...[0m
[0m[[0minfo[0m] [0mNothing to compile.[0m
[0m[[0minfo[0m] [0m Post-analysis: 0 classes.[0m
[0m[[0minfo[0m] [34m== java-examples / test-compile ==[0m
[0m[[0minfo[0m] [34m[0m
[0m[[0minfo[0m] [34m== perf / test-compile ==[0m
[0m[[0minfo[0m] [0m Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.[0m
[0m[[0minfo[0m] [0mCompiling test sources...[0m
[0m[[0minfo[0m] [0mNothing to compile.[0m
[0m[[0minfo[0m] [0m Post-analysis: 0 classes.[0m
[0m[[0minfo[0m] [34m== perf / test-compile ==[0m
[0m[[0minfo[0m] [34m[0m
[0m[[0minfo[0m] [34m== hadoop producer / copy-resources ==[0m
[0m[[0minfo[0m] [34m== hadoop producer / copy-resources ==[0m
[0m[[0minfo[0m] [34m[0m
[0m[[0minfo[0m] [34m== hadoop producer / copy-test-resources ==[0m
[0m[[0minfo[0m] [34m== hadoop producer / copy-test-resources ==[0m
[0m[[0minfo[0m] [34m[0m
[0m[[0minfo[0m] [34m== java-examples / copy-test-resources ==[0m
[0m[[0minfo[0m] [34m== java-examples / copy-test-resources ==[0m
[0m[[0minfo[0m] [34m[0m
[0m[[0minfo[0m] [34m== hadoop consumer / copy-test-resources ==[0m
[0m[[0minfo[0m] [34m== hadoop consumer / copy-test-resources ==[0m
[0m[[0minfo[0m] [34m[0m
[0m[[0minfo[0m] [34m== java-examples / copy-resources ==[0m
[0m[[0minfo[0m] [34m== java-examples / copy-resources ==[0m
[0m[[0minfo[0m] [34m[0m
[0m[[0minfo[0m] [34m== hadoop producer / test-compile ==[0m
[0m[[0minfo[0m] [0m Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.[0m
[0m[[0minfo[0m] [0mCompiling test sources...[0m
[0m[[0minfo[0m] [0mNothing to compile.[0m
[0m[[0minfo[0m] [0m Post-analysis: 0 classes.[0m
[0m[[0minfo[0m] [34m== hadoop producer / test-compile ==[0m
[0m[[0minfo[0m] [34m[0m
[0m[[0minfo[0m] [34m== hadoop consumer / test-compile ==[0m
[0m[[0minfo[0m] [0m Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.[0m
[0m[[0minfo[0m] [0mCompiling test sources...[0m
[0m[[0minfo[0m] [0mNothing to compile.[0m
[0m[[0minfo[0m] [0m Post-analysis: 0 classes.[0m
[0m[[0minfo[0m] [34m== hadoop consumer / test-compile ==[0m
[0m[[0minfo[0m] [34m[0m
[0m[[0minfo[0m] [34m== hadoop consumer / copy-resources ==[0m
[0m[[0minfo[0m] [34m== hadoop consumer / copy-resources ==[0m
[0m[[0minfo[0m] [34m[0m
[0m[[0minfo[0m] [34m== perf / copy-resources ==[0m
[0m[[0minfo[0m] [34m== perf / copy-resources ==[0m
[0m[[31merror[0m] [0mError running kafka.integration.AutoOffsetResetTest: Test FAILED[0m
[0m[[31merror[0m] [0mError running test: One or more subtasks failed[0m
[0m[[0minfo[0m] [0m[0m
[0m[[0minfo[0m] [0mTotal time: 142 s, completed Sep 27, 2012 1:40:33 AM[0m
[0m[[0minfo[0m] [0m[0m
[0m[[0minfo[0m] [0mTotal session time: 142 s, completed Sep 27, 2012 1:40:33 AM[0m
[0m[[31merror[0m] [0mError during build.[0m
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: Kafka-trunk #144
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Kafka-trunk/144/changes>
Changes:
[junrao] TopicCount.constructTopicCount isn't thread-safe; patched by Jun Rao; reviewed by Joel Koshy; KAFKA-379
------------------------------------------
[...truncated 420 lines...]
[info]
[info] == core-kafka / kafka.message.FileMessageSetTest ==
[info] Test Starting: testWrittenEqualsRead
[info] Test Passed: testWrittenEqualsRead
[info] Test Starting: testIteratorIsConsistent
[info] Test Passed: testIteratorIsConsistent
[info] Test Starting: testSizeInBytes
[info] Test Passed: testSizeInBytes
[info] Test Starting: testWriteTo
[info] Test Passed: testWriteTo
[info] Test Starting: testFileSize
[info] Test Passed: testFileSize
[info] Test Starting: testIterationOverPartialAndTruncation
[info] Test Passed: testIterationOverPartialAndTruncation
[info] Test Starting: testIterationDoesntChangePosition
[info] Test Passed: testIterationDoesntChangePosition
[info] Test Starting: testRead
[info] Test Passed: testRead
[info] == core-kafka / kafka.message.FileMessageSetTest ==
[info]
[info] == core-kafka / kafka.message.CompressionUtilTest ==
[info] Test Starting: testSimpleCompressDecompress
[info] Test Passed: testSimpleCompressDecompress
[info] Test Starting: testComplexCompressDecompress
[info] Test Passed: testComplexCompressDecompress
[info] Test Starting: testSnappyCompressDecompressExplicit
[info] Test Passed: testSnappyCompressDecompressExplicit
[info] == core-kafka / kafka.message.CompressionUtilTest ==
[info]
[info] == core-kafka / kafka.log.LogTest ==
[info] Test Starting: testTimeBasedLogRoll
[info] Test Passed: testTimeBasedLogRoll
[info] Test Starting: testSizeBasedLogRoll
[info] Test Passed: testSizeBasedLogRoll
[info] Test Starting: testLoadEmptyLog
[info] Test Passed: testLoadEmptyLog
[info] Test Starting: testLoadInvalidLogsFails
[info] Test Passed: testLoadInvalidLogsFails
[info] Test Starting: testAppendAndRead
[info] Test Passed: testAppendAndRead
[info] Test Starting: testReadOutOfRange
[info] Test Passed: testReadOutOfRange
[info] Test Starting: testLogRolls
[info] Test Passed: testLogRolls
[info] Test Starting: testFindSegment
[info] Test Passed: testFindSegment
[info] Test Starting: testEdgeLogRolls
[info] Test Passed: testEdgeLogRolls
[info] Test Starting: testMessageSizeCheck
[info] Test Passed: testMessageSizeCheck
[info] == core-kafka / kafka.log.LogTest ==
[info]
[info] == core-kafka / kafka.consumer.TopicFilterTest ==
[info] Test Starting: testWhitelists
[info] Test Passed: testWhitelists
[info] Test Starting: testBlacklists
[info] Test Passed: testBlacklists
[info] == core-kafka / kafka.consumer.TopicFilterTest ==
[info]
[info] == core-kafka / kafka.javaapi.integration.PrimitiveApiTest ==
[info] Test Starting: testProduceAndMultiFetch(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndMultiFetch(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndMultiFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduce(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testMultiProduce(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testMultiProduceWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testMultiProduceWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndFetch(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndFetch(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndFetchWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchJava(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndMultiFetchJava(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Starting: testProduceAndMultiFetchJavaWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] Test Passed: testProduceAndMultiFetchJavaWithCompression(kafka.javaapi.integration.PrimitiveApiTest)
[info] == core-kafka / kafka.javaapi.integration.PrimitiveApiTest ==
[info]
[info] == core-kafka / kafka.javaapi.consumer.ZookeeperConsumerConnectorTest ==
[info] Test Starting: testBasic(kafka.javaapi.consumer.ZookeeperConsumerConnectorTest)
[info] Test Passed: testBasic(kafka.javaapi.consumer.ZookeeperConsumerConnectorTest)
[info] == core-kafka / kafka.javaapi.consumer.ZookeeperConsumerConnectorTest ==
[info]
[info] == core-kafka / kafka.message.ByteBufferMessageSetTest ==
[info] Test Starting: testWrittenEqualsRead
[info] Test Passed: testWrittenEqualsRead
[info] Test Starting: testIteratorIsConsistent
[info] Test Passed: testIteratorIsConsistent
[info] Test Starting: testSizeInBytes
[info] Test Passed: testSizeInBytes
[info] Test Starting: testWriteTo
[info] Test Passed: testWriteTo
[info] Test Starting: testSmallFetchSize
[info] Test Passed: testSmallFetchSize
[info] Test Starting: testValidBytes
[info] Test Passed: testValidBytes
[info] Test Starting: testEquals
[info] Test Passed: testEquals
[info] Test Starting: testIterator
[info] Test Passed: testIterator
[info] == core-kafka / kafka.message.ByteBufferMessageSetTest ==
[info]
[info] == core-kafka / kafka.integration.AutoOffsetResetTest ==
[info] Test Starting: testEarliestOffsetResetForward(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testEarliestOffsetResetForward(kafka.integration.AutoOffsetResetTest)
[info] Test Starting: testEarliestOffsetResetBackward(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testEarliestOffsetResetBackward(kafka.integration.AutoOffsetResetTest)
[info] Test Starting: testLatestOffsetResetForward(kafka.integration.AutoOffsetResetTest)
[info] Test Passed: testLatestOffsetResetForward(kafka.integration.AutoOffsetResetTest)
[info] == core-kafka / kafka.integration.AutoOffsetResetTest ==
[info]
[info] == core-kafka / kafka.javaapi.message.ByteBufferMessageSetTest ==
[info] Test Starting: testWrittenEqualsRead
[info] Test Passed: testWrittenEqualsRead
[info] Test Starting: testIteratorIsConsistent
[info] Test Passed: testIteratorIsConsistent
[info] Test Starting: testSizeInBytes
[info] Test Passed: testSizeInBytes
[info] Test Starting: testValidBytes
[info] Test Passed: testValidBytes
[info] Test Starting: testEquals
[info] Test Passed: testEquals
[info] Test Starting: testIteratorIsConsistentWithCompression
[info] Test Passed: testIteratorIsConsistentWithCompression
[info] Test Starting: testSizeInBytesWithCompression
[info] Test Passed: testSizeInBytesWithCompression
[info] Test Starting: testValidBytesWithCompression
[info] Test Passed: testValidBytesWithCompression
[info] Test Starting: testEqualsWithCompression
[info] Test Passed: testEqualsWithCompression
[info] == core-kafka / kafka.javaapi.message.ByteBufferMessageSetTest ==
[info]
[info] == core-kafka / kafka.server.ServerShutdownTest ==
[info] Test Starting: testCleanShutdown
[info] Test Passed: testCleanShutdown
[info] == core-kafka / kafka.server.ServerShutdownTest ==
[info]
[info] == core-kafka / kafka.producer.SyncProducerTest ==
[info] Test Starting: testReachableServer
[info] Test Passed: testReachableServer
[info] Test Starting: testSingleMessageSizeTooLarge
[info] Test Passed: testSingleMessageSizeTooLarge
[info] Test Starting: testCompressedMessageSizeTooLarge
[info] Test Passed: testCompressedMessageSizeTooLarge
[info] == core-kafka / kafka.producer.SyncProducerTest ==
[info]
[info] == core-kafka / kafka.log.LogOffsetTest ==
[info] Test Starting: testEmptyLogs
[info] Test Passed: testEmptyLogs
[info] Test Starting: testGetOffsetsBeforeLatestTime
[info] Test Passed: testGetOffsetsBeforeLatestTime
[info] Test Starting: testEmptyLogsGetOffsets
[info] Test Passed: testEmptyLogsGetOffsets
[info] Test Starting: testGetOffsetsBeforeNow
[info] Test Passed: testGetOffsetsBeforeNow
[info] Test Starting: testGetOffsetsBeforeEarliestTime
[info] Test Passed: testGetOffsetsBeforeEarliestTime
[info] == core-kafka / kafka.log.LogOffsetTest ==
[info]
[info] == core-kafka / kafka.zk.ZKEphemeralTest ==
[info] Test Starting: testEphemeralNodeCleanup(kafka.zk.ZKEphemeralTest)
[info] Test Passed: testEphemeralNodeCleanup(kafka.zk.ZKEphemeralTest)
[info] == core-kafka / kafka.zk.ZKEphemeralTest ==
[info]
[info] == core-kafka / kafka.integration.BackwardsCompatibilityTest ==
[info] Test Starting: testProtocolVersion0(kafka.integration.BackwardsCompatibilityTest)
[info] Test Passed: testProtocolVersion0(kafka.integration.BackwardsCompatibilityTest)
[info] == core-kafka / kafka.integration.BackwardsCompatibilityTest ==
[info]
[info] == core-kafka / kafka.message.MessageTest ==
[info] Test Starting: testFieldValues
[info] Test Passed: testFieldValues
[info] Test Starting: testChecksum
[info] Test Passed: testChecksum
[info] Test Starting: testEquality
[info] Test Passed: testEquality
[info] Test Starting: testIsHashable
[info] Test Passed: testIsHashable
[info] == core-kafka / kafka.message.MessageTest ==
[info]
[info] == core-kafka / kafka.javaapi.producer.ProducerTest ==
[info] Test Starting: testSend
[info] Test Passed: testSend
[info] Test Starting: testSendSingleMessage
[info] Test Passed: testSendSingleMessage
[info] Test Starting: testInvalidPartition
[info] Test Passed: testInvalidPartition
[info] Test Starting: testSyncProducerPool
[info] Test Passed: testSyncProducerPool
[info] Test Starting: testAsyncProducerPool
[info] Test Passed: testAsyncProducerPool
[info] Test Starting: testSyncUnavailableProducerException
[info] Test Passed: testSyncUnavailableProducerException
[info] Test Starting: testAsyncUnavailableProducerException
[info] Test Passed: testAsyncUnavailableProducerException
[info] Test Starting: testConfigBrokerPartitionInfoWithPartitioner
[info] Test Passed: testConfigBrokerPartitionInfoWithPartitioner
[info] Test Starting: testConfigBrokerPartitionInfo
[info] Test Passed: testConfigBrokerPartitionInfo
[info] Test Starting: testZKSendToNewTopic
[info] Test Passed: testZKSendToNewTopic
[info] Test Starting: testZKSendWithDeadBroker
[info] Test Passed: testZKSendWithDeadBroker
[info] Test Starting: testPartitionedSendToNewTopic
[info] Test Passed: testPartitionedSendToNewTopic
[info] Test Starting: testPartitionedSendToNewBrokerInExistingTopic
[info] Test Passed: testPartitionedSendToNewBrokerInExistingTopic
[info] Test Starting: testDefaultPartitioner
[info] Test Passed: testDefaultPartitioner
[info] == core-kafka / kafka.javaapi.producer.ProducerTest ==
[info]
[info] == core-kafka / kafka.consumer.FetcherTest ==
[info] Test Starting: testFetcher(kafka.consumer.FetcherTest)
[info] Test Passed: testFetcher(kafka.consumer.FetcherTest)
[info] == core-kafka / kafka.consumer.FetcherTest ==
[info]
[info] == core-kafka / Test cleanup 1 ==
[info] Deleting directory /tmp/sbt_aa7c472d
[info] == core-kafka / Test cleanup 1 ==
[info]
[info] == core-kafka / test-finish ==
[error] Failed: : Total 141, Failed 1, Errors 0, Passed 140, Skipped 0
[info] == core-kafka / test-finish ==
[info]
[info] == core-kafka / test-cleanup ==
[info] == core-kafka / test-cleanup ==
[info]
[info] == perf / test-compile ==
[info]  Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.
[info] Compiling test sources...
[info] Nothing to compile.
[info]  Post-analysis: 0 classes.
[info] == perf / test-compile ==
[info]
[info] == hadoop consumer / copy-test-resources ==
[info] == hadoop consumer / copy-test-resources ==
[info]
[info] == hadoop consumer / copy-resources ==
[info] == hadoop consumer / copy-resources ==
[error] Error running kafka.producer.ProducerTest: Test FAILED
[error] Error running compile: javac returned nonzero exit code
[error] Error running compile: javac returned nonzero exit code
[error] Error running test: One or more subtasks failed
[info]
[info] Total time: 137 s, completed Sep 18, 2012 6:04:25 AM
[info]
[info] Total session time: 137 s, completed Sep 18, 2012 6:04:25 AM
[error] Error during build.
Build step 'Execute shell' marked build as failure