Posted to dev@kafka.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2017/04/28 12:11:18 UTC
Build failed in Jenkins: kafka-trunk-jdk7 #2129
See <https://builds.apache.org/job/kafka-trunk-jdk7/2129/display/redirect?page=changes>
Changes:
[jason] KAFKA-5097; Add testFetchAfterPartitionWithFetchedRecordsIsUnassigned
[ismael] MINOR: Make assignment expectation explicit in
------------------------------------------
[...truncated 1.63 MB...]
kafka.log.LogTest > testParseTopicPartitionNameForMissingPartition STARTED
kafka.log.LogTest > testParseTopicPartitionNameForMissingPartition PASSED
kafka.log.LogTest > testParseTopicPartitionNameForEmptyName STARTED
kafka.log.LogTest > testParseTopicPartitionNameForEmptyName PASSED
kafka.log.LogTest > testOpenDeletesObsoleteFiles STARTED
kafka.log.LogTest > testOpenDeletesObsoleteFiles PASSED
kafka.log.LogTest > testUpdatePidMapWithCompactedData STARTED
kafka.log.LogTest > testUpdatePidMapWithCompactedData PASSED
kafka.log.LogTest > shouldUpdateOffsetForLeaderEpochsWhenDeletingSegments STARTED
kafka.log.LogTest > shouldUpdateOffsetForLeaderEpochsWhenDeletingSegments PASSED
kafka.log.LogTest > testPeriodicPidSnapshot STARTED
kafka.log.LogTest > testPeriodicPidSnapshot PASSED
kafka.log.LogTest > testRebuildTimeIndexForOldMessages STARTED
kafka.log.LogTest > testRebuildTimeIndexForOldMessages PASSED
kafka.log.LogTest > testLogRecoversForLeaderEpoch STARTED
kafka.log.LogTest > testLogRecoversForLeaderEpoch PASSED
kafka.log.LogTest > testSizeBasedLogRoll STARTED
kafka.log.LogTest > testSizeBasedLogRoll PASSED
kafka.log.LogTest > shouldNotDeleteSizeBasedSegmentsWhenUnderRetentionSize STARTED
kafka.log.LogTest > shouldNotDeleteSizeBasedSegmentsWhenUnderRetentionSize PASSED
kafka.log.LogTest > testTimeBasedLogRollJitter STARTED
kafka.log.LogTest > testTimeBasedLogRollJitter PASSED
kafka.log.LogTest > testParseTopicPartitionName STARTED
kafka.log.LogTest > testParseTopicPartitionName PASSED
kafka.log.LogTest > testPidMapTruncateTo STARTED
kafka.log.LogTest > testPidMapTruncateTo PASSED
kafka.log.LogTest > testTruncateTo STARTED
kafka.log.LogTest > testTruncateTo PASSED
kafka.log.LogTest > shouldApplyEpochToMessageOnAppendIfLeader STARTED
kafka.log.LogTest > shouldApplyEpochToMessageOnAppendIfLeader PASSED
kafka.log.LogTest > testCleanShutdownFile STARTED
kafka.log.LogTest > testCleanShutdownFile PASSED
kafka.log.LogTest > testPidExpirationOnSegmentDeletion STARTED
kafka.log.LogTest > testPidExpirationOnSegmentDeletion PASSED
kafka.log.LogTest > testBuildTimeIndexWhenNotAssigningOffsets STARTED
kafka.log.LogTest > testBuildTimeIndexWhenNotAssigningOffsets PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[0] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[0] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[1] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[1] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[2] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[2] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[3] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[3] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[4] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[4] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[5] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[5] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[6] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[6] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[7] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[7] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[8] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[8] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[9] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[9] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[10] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[10] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[11] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[11] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[12] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[12] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[13] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[13] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[14] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[14] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[15] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[15] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[16] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[16] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[17] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[17] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[18] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[18] PASSED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[19] STARTED
kafka.log.BrokerCompressionTest > testBrokerSideCompression[19] PASSED
kafka.log.LogCleanerIntegrationTest > testCleansCombinedCompactAndDeleteTopic[0] STARTED
kafka.log.LogCleanerIntegrationTest > testCleansCombinedCompactAndDeleteTopic[0] PASSED
kafka.log.LogCleanerIntegrationTest > testCleaningNestedMessagesWithMultipleVersions[0] STARTED
kafka.log.LogCleanerIntegrationTest > testCleaningNestedMessagesWithMultipleVersions[0] PASSED
kafka.log.LogCleanerIntegrationTest > cleanerTest[0] STARTED
kafka.log.LogCleanerIntegrationTest > cleanerTest[0] PASSED
kafka.log.LogCleanerIntegrationTest > testCleanerWithMessageFormatV0[0] STARTED
kafka.log.LogCleanerIntegrationTest > testCleanerWithMessageFormatV0[0] PASSED
kafka.log.LogCleanerIntegrationTest > testCleansCombinedCompactAndDeleteTopic[1] STARTED
kafka.log.LogCleanerIntegrationTest > testCleansCombinedCompactAndDeleteTopic[1] PASSED
kafka.log.LogCleanerIntegrationTest > testCleaningNestedMessagesWithMultipleVersions[1] STARTED
kafka.log.LogCleanerIntegrationTest > testCleaningNestedMessagesWithMultipleVersions[1] PASSED
kafka.log.LogCleanerIntegrationTest > cleanerTest[1] STARTED
kafka.log.LogCleanerIntegrationTest > cleanerTest[1] PASSED
kafka.log.LogCleanerIntegrationTest > testCleanerWithMessageFormatV0[1] STARTED
kafka.log.LogCleanerIntegrationTest > testCleanerWithMessageFormatV0[1] PASSED
kafka.log.LogCleanerIntegrationTest > testCleansCombinedCompactAndDeleteTopic[2] STARTED
kafka.log.LogCleanerIntegrationTest > testCleansCombinedCompactAndDeleteTopic[2] PASSED
kafka.log.LogCleanerIntegrationTest > testCleaningNestedMessagesWithMultipleVersions[2] STARTED
kafka.log.LogCleanerIntegrationTest > testCleaningNestedMessagesWithMultipleVersions[2] PASSED
kafka.log.LogCleanerIntegrationTest > cleanerTest[2] STARTED
kafka.log.LogCleanerIntegrationTest > cleanerTest[2] PASSED
kafka.log.LogCleanerIntegrationTest > testCleanerWithMessageFormatV0[2] STARTED
kafka.log.LogCleanerIntegrationTest > testCleanerWithMessageFormatV0[2] PASSED
kafka.log.LogCleanerIntegrationTest > testCleansCombinedCompactAndDeleteTopic[3] STARTED
kafka.log.LogCleanerIntegrationTest > testCleansCombinedCompactAndDeleteTopic[3] PASSED
kafka.log.LogCleanerIntegrationTest > testCleaningNestedMessagesWithMultipleVersions[3] STARTED
kafka.log.LogCleanerIntegrationTest > testCleaningNestedMessagesWithMultipleVersions[3] PASSED
kafka.log.LogCleanerIntegrationTest > cleanerTest[3] STARTED
kafka.log.LogCleanerIntegrationTest > cleanerTest[3] PASSED
kafka.log.LogCleanerIntegrationTest > testCleanerWithMessageFormatV0[3] STARTED
kafka.log.LogCleanerIntegrationTest > testCleanerWithMessageFormatV0[3] PASSED
unit.kafka.coordinator.transaction.TransactionCoordinatorIntegrationTest > shouldCommitTransaction STARTED
unit.kafka.coordinator.transaction.TransactionCoordinatorIntegrationTest > shouldCommitTransaction PASSED
1279 tests completed, 1 failed
:kafka-trunk-jdk7:core:test FAILED
:test_core_2_11 FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':kafka-trunk-jdk7:core:test'.
> There were failing tests. See the report at: <https://builds.apache.org/job/kafka-trunk-jdk7/ws/core/build/reports/tests/test/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
BUILD FAILED
Total time: 1 hrs 41 mins 36.61 secs
Build step 'Execute shell' marked build as failure
[FINDBUGS] Collecting findbugs analysis files...
Setting GRADLE_3_4_RC_2_HOME=/home/jenkins/jenkins-slave/tools/hudson.plugins.gradle.GradleInstallation/Gradle_3.4-rc-2
[FINDBUGS] Searching for all files in <https://builds.apache.org/job/kafka-trunk-jdk7/ws/> that match the pattern **/build/reports/findbugs/*.xml
[FINDBUGS] Parsing 1 file in <https://builds.apache.org/job/kafka-trunk-jdk7/ws/>
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk7/ws/core/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Computing warning deltas based on reference build #2127
Recording test results
Setting GRADLE_3_4_RC_2_HOME=/home/jenkins/jenkins-slave/tools/hudson.plugins.gradle.GradleInstallation/Gradle_3.4-rc-2
Setting GRADLE_3_4_RC_2_HOME=/home/jenkins/jenkins-slave/tools/hudson.plugins.gradle.GradleInstallation/Gradle_3.4-rc-2
Not sending mail to unregistered user wangguoz@gmail.com
Not sending mail to unregistered user ismael@juma.me.uk
Jenkins build is back to normal : kafka-trunk-jdk7 #2132
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/kafka-trunk-jdk7/2132/display/redirect?page=changes>
Build failed in Jenkins: kafka-trunk-jdk7 #2131
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/kafka-trunk-jdk7/2131/display/redirect?page=changes>
Changes:
[junrao] KAFKA-4954; Request handler utilization quotas
------------------------------------------
[...truncated 822.64 KB...]
java.lang.OutOfMemoryError: unable to create new native thread
java.lang.NullPointerException
kafka.server.ControlledShutdownLeaderSelectorTest > testSelectLeader STARTED
kafka.server.ControlledShutdownLeaderSelectorTest > testSelectLeader PASSED
kafka.server.HighwatermarkPersistenceTest > testHighWatermarkPersistenceMultiplePartitions STARTED
kafka.server.HighwatermarkPersistenceTest > testHighWatermarkPersistenceMultiplePartitions FAILED
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at kafka.server.ClientQuotaManager.start(ClientQuotaManager.scala:153)
at kafka.server.ClientQuotaManager.<init>(ClientQuotaManager.scala:151)
at kafka.server.QuotaFactory$.instantiate(QuotaFactory.scala:51)
at kafka.server.HighwatermarkPersistenceTest.testHighWatermarkPersistenceMultiplePartitions(HighwatermarkPersistenceTest.scala:108)
kafka.server.HighwatermarkPersistenceTest > testHighWatermarkPersistenceSinglePartition STARTED
kafka.server.HighwatermarkPersistenceTest > testHighWatermarkPersistenceSinglePartition FAILED
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at kafka.server.ClientQuotaManager.start(ClientQuotaManager.scala:153)
at kafka.server.ClientQuotaManager.<init>(ClientQuotaManager.scala:151)
at kafka.server.QuotaFactory$.instantiate(QuotaFactory.scala:51)
at kafka.server.HighwatermarkPersistenceTest.testHighWatermarkPersistenceSinglePartition(HighwatermarkPersistenceTest.scala:63)
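Nearly every failure in this run shares one root cause: the JVM could not start new native threads. A quick way to check whether a build agent is hitting an OS-level thread limit (a hedged diagnostic sketch, not something taken from this log) is:

```shell
# "java.lang.OutOfMemoryError: unable to create new native thread" usually
# means the OS refused to create the thread (per-user process/thread limit
# or exhausted native memory), not that the Java heap is full.

# Maximum processes/threads the current user may create:
ulimit -u

# Threads currently alive across all processes (Linux; ps -eLf prints one
# line per thread):
ps -eLf | wc -l
```

If the second number is close to the first, the fix is typically to raise the limit for the Jenkins user (e.g. via /etc/security/limits.conf) or to reduce test parallelism on the agent.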
kafka.server.FetchRequestTest > testBrokerRespectsPartitionsOrderAndSizeLimits STARTED
kafka.server.FetchRequestTest > testBrokerRespectsPartitionsOrderAndSizeLimits FAILED
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at org.apache.zookeeper.server.ZooKeeperServer.setupRequestProcessors(ZooKeeperServer.java:431)
at org.apache.zookeeper.server.ZooKeeperServer.startup(ZooKeeperServer.java:419)
at org.apache.zookeeper.server.NIOServerCnxnFactory.startup(NIOServerCnxnFactory.java:119)
at kafka.zk.EmbeddedZookeeper.<init>(EmbeddedZookeeper.scala:36)
at kafka.zk.ZooKeeperTestHarness.setUp(ZooKeeperTestHarness.scala:44)
at kafka.integration.KafkaServerTestHarness.setUp(KafkaServerTestHarness.scala:83)
at kafka.server.FetchRequestTest.setUp(FetchRequestTest.scala:46)
java.lang.NullPointerException
at kafka.server.FetchRequestTest.tearDown(FetchRequestTest.scala:52)
kafka.server.FetchRequestTest > testFetchRequestV2WithOversizedMessage STARTED
kafka.server.FetchRequestTest > testFetchRequestV2WithOversizedMessage FAILED
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at org.apache.zookeeper.server.NIOServerCnxnFactory.start(NIOServerCnxnFactory.java:109)
at org.apache.zookeeper.server.NIOServerCnxnFactory.startup(NIOServerCnxnFactory.java:116)
at kafka.zk.EmbeddedZookeeper.<init>(EmbeddedZookeeper.scala:36)
at kafka.zk.ZooKeeperTestHarness.setUp(ZooKeeperTestHarness.scala:44)
at kafka.integration.KafkaServerTestHarness.setUp(KafkaServerTestHarness.scala:83)
at kafka.server.FetchRequestTest.setUp(FetchRequestTest.scala:46)
java.lang.NullPointerException
at kafka.server.FetchRequestTest.tearDown(FetchRequestTest.scala:52)
kafka.server.MetadataRequestTest > testReplicaDownResponse STARTED
kafka.server.MetadataRequestTest > testReplicaDownResponse FAILED
java.lang.OutOfMemoryError: unable to create new native thread
kafka.server.MetadataRequestTest > testRack STARTED
kafka.server.MetadataRequestTest > testRack FAILED
java.lang.OutOfMemoryError: unable to create new native thread
kafka.server.MetadataRequestTest > testIsInternal STARTED
kafka.server.MetadataRequestTest > testIsInternal FAILED
java.lang.OutOfMemoryError: unable to create new native thread
kafka.server.MetadataRequestTest > testControllerId STARTED
kafka.server.MetadataRequestTest > testControllerId FAILED
java.lang.OutOfMemoryError: unable to create new native thread
kafka.server.MetadataRequestTest > testAllTopicsRequest STARTED
kafka.server.MetadataRequestTest > testAllTopicsRequest FAILED
java.lang.OutOfMemoryError: unable to create new native thread
kafka.server.MetadataRequestTest > testClusterIdIsValid STARTED
kafka.server.MetadataRequestTest > testClusterIdIsValid FAILED
java.lang.OutOfMemoryError: unable to create new native thread
kafka.server.MetadataRequestTest > testNoTopicsRequest STARTED
kafka.server.MetadataRequestTest > testNoTopicsRequest FAILED
java.lang.OutOfMemoryError: unable to create new native thread
kafka.server.MetadataRequestTest > testClusterIdWithRequestVersion1 STARTED
kafka.server.MetadataRequestTest > testClusterIdWithRequestVersion1 FAILED
java.lang.OutOfMemoryError: unable to create new native thread
kafka.server.RequestQuotaTest > testUnauthorizedThrottle STARTED
kafka.server.RequestQuotaTest > testUnauthorizedThrottle FAILED
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at kafka.server.ClientQuotaManager.start(ClientQuotaManager.scala:153)
at kafka.server.ClientQuotaManager.<init>(ClientQuotaManager.scala:151)
at kafka.server.QuotaFactory$.instantiate(QuotaFactory.scala:50)
at kafka.server.KafkaServer.startup(KafkaServer.scala:207)
at kafka.utils.TestUtils$.createServer(TestUtils.scala:126)
at kafka.integration.KafkaServerTestHarness$$anonfun$setUp$1.apply(KafkaServerTestHarness.scala:91)
at kafka.integration.KafkaServerTestHarness$$anonfun$setUp$1.apply(KafkaServerTestHarness.scala:91)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at kafka.integration.KafkaServerTestHarness.setUp(KafkaServerTestHarness.scala:91)
at kafka.server.RequestQuotaTest.setUp(RequestQuotaTest.scala:68)
kafka.server.RequestQuotaTest > testUnthrottledClient STARTED
kafka.server.RequestQuotaTest > testUnthrottledClient FAILED
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at kafka.network.Acceptor$$anonfun$6.apply(SocketServer.scala:260)
at kafka.network.Acceptor$$anonfun$6.apply(SocketServer.scala:258)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
at kafka.network.Acceptor.<init>(SocketServer.scala:258)
at kafka.network.SocketServer$$anonfun$startup$1.apply(SocketServer.scala:98)
at kafka.network.SocketServer$$anonfun$startup$1.apply(SocketServer.scala:90)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at kafka.network.SocketServer.startup(SocketServer.scala:90)
at kafka.server.KafkaServer.startup(KafkaServer.scala:218)
at kafka.utils.TestUtils$.createServer(TestUtils.scala:126)
at kafka.integration.KafkaServerTestHarness$$anonfun$setUp$1.apply(KafkaServerTestHarness.scala:91)
at kafka.integration.KafkaServerTestHarness$$anonfun$setUp$1.apply(KafkaServerTestHarness.scala:91)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at kafka.integration.KafkaServerTestHarness.setUp(KafkaServerTestHarness.scala:91)
at kafka.server.RequestQuotaTest.setUp(RequestQuotaTest.scala:68)
kafka.server.RequestQuotaTest > testExemptRequestTime STARTED
ERROR: Could not install GRADLE_3_4_RC_2_HOME
java.lang.NullPointerException
at hudson.plugins.toolenv.ToolEnvBuildWrapper$1.buildEnvVars(ToolEnvBuildWrapper.java:46)
at hudson.model.AbstractBuild.getEnvironment(AbstractBuild.java:930)
at hudson.plugins.git.GitSCM.getParamExpandedRepos(GitSCM.java:418)
at hudson.plugins.git.GitSCM.compareRemoteRevisionWithImpl(GitSCM.java:624)
at hudson.plugins.git.GitSCM.compareRemoteRevisionWith(GitSCM.java:589)
at hudson.scm.SCM.compareRemoteRevisionWith(SCM.java:392)
at hudson.scm.SCM.poll(SCM.java:409)
at hudson.model.AbstractProject._poll(AbstractProject.java:1463)
at hudson.model.AbstractProject.poll(AbstractProject.java:1366)
at hudson.triggers.SCMTrigger$Runner.runPolling(SCMTrigger.java:596)
at hudson.triggers.SCMTrigger$Runner.run(SCMTrigger.java:642)
at hudson.util.SequentialExecutionQueue$QueueEntry.run(SequentialExecutionQueue.java:119)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Build timed out (after 240 minutes). Marking the build as failed.
Build was aborted
[FINDBUGS] Collecting findbugs analysis files...
Setting GRADLE_3_4_RC_2_HOME=/home/jenkins/jenkins-slave/tools/hudson.plugins.gradle.GradleInstallation/Gradle_3.4-rc-2
[FINDBUGS] Searching for all files in <https://builds.apache.org/job/kafka-trunk-jdk7/ws/> that match the pattern **/build/reports/findbugs/*.xml
[FINDBUGS] Parsing 1 file in <https://builds.apache.org/job/kafka-trunk-jdk7/ws/>
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk7/ws/core/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Computing warning deltas based on reference build #2127
Recording test results
Setting GRADLE_3_4_RC_2_HOME=/home/jenkins/jenkins-slave/tools/hudson.plugins.gradle.GradleInstallation/Gradle_3.4-rc-2
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Setting GRADLE_3_4_RC_2_HOME=/home/jenkins/jenkins-slave/tools/hudson.plugins.gradle.GradleInstallation/Gradle_3.4-rc-2
Not sending mail to unregistered user ismael@juma.me.uk
Not sending mail to unregistered user wangguoz@gmail.com
Build failed in Jenkins: kafka-trunk-jdk7 #2130
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/kafka-trunk-jdk7/2130/display/redirect?page=changes>
Changes:
[jason] KAFKA-4208; Add Record Headers
------------------------------------------
[...truncated 2.40 MB...]
org.apache.kafka.streams.state.internals.StoreChangeLoggerTest > testAddRemove PASSED
org.apache.kafka.streams.state.internals.RocksDBSegmentedBytesStoreTest > shouldRemove STARTED
org.apache.kafka.streams.state.internals.RocksDBSegmentedBytesStoreTest > shouldRemove PASSED
org.apache.kafka.streams.state.internals.RocksDBSegmentedBytesStoreTest > shouldPutAndFetch STARTED
org.apache.kafka.streams.state.internals.RocksDBSegmentedBytesStoreTest > shouldPutAndFetch PASSED
org.apache.kafka.streams.state.internals.RocksDBSegmentedBytesStoreTest > shouldRollSegments STARTED
org.apache.kafka.streams.state.internals.RocksDBSegmentedBytesStoreTest > shouldRollSegments PASSED
org.apache.kafka.streams.state.internals.RocksDBSegmentedBytesStoreTest > shouldFindValuesWithinRange STARTED
org.apache.kafka.streams.state.internals.RocksDBSegmentedBytesStoreTest > shouldFindValuesWithinRange PASSED
org.apache.kafka.streams.state.internals.SessionKeySchemaTest > shouldFetchExactKeySkippingShorterKeys STARTED
org.apache.kafka.streams.state.internals.SessionKeySchemaTest > shouldFetchExactKeySkippingShorterKeys PASSED
org.apache.kafka.streams.state.internals.SessionKeySchemaTest > shouldFetchExactKeysSkippingLongerKeys STARTED
org.apache.kafka.streams.state.internals.SessionKeySchemaTest > shouldFetchExactKeysSkippingLongerKeys PASSED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldReturnValueIfExists STARTED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldReturnValueIfExists PASSED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldNotGetValuesFromOtherStores STARTED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldNotGetValuesFromOtherStores PASSED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldGetApproximateEntriesAcrossAllStores STARTED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldGetApproximateEntriesAcrossAllStores PASSED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldReturnLongMaxValueOnOverflow STARTED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldReturnLongMaxValueOnOverflow PASSED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldReturnNullIfKeyDoesntExist STARTED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldReturnNullIfKeyDoesntExist PASSED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldThrowInvalidStoreExceptionDuringRebalance STARTED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldThrowInvalidStoreExceptionDuringRebalance PASSED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldThrowInvalidStoreExceptionOnAllDuringRebalance STARTED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldThrowInvalidStoreExceptionOnAllDuringRebalance PASSED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldSupportRange STARTED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldSupportRange PASSED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldFindValueForKeyWhenMultiStores STARTED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldFindValueForKeyWhenMultiStores PASSED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldThrowInvalidStoreExceptionOnRangeDuringRebalance STARTED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldThrowInvalidStoreExceptionOnRangeDuringRebalance PASSED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldSupportRangeAcrossMultipleKVStores STARTED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldSupportRangeAcrossMultipleKVStores PASSED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldSupportAllAcrossMultipleStores STARTED
org.apache.kafka.streams.state.internals.CompositeReadOnlyKeyValueStoreTest > shouldSupportAllAcrossMultipleStores PASSED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldCleanupSegmentsThatHaveExpired STARTED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldCleanupSegmentsThatHaveExpired PASSED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldBaseSegmentIntervalOnRetentionAndNumSegments STARTED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldBaseSegmentIntervalOnRetentionAndNumSegments PASSED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldCloseAllOpenSegments STARTED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldCloseAllOpenSegments PASSED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldGetSegmentsWithinTimeRange STARTED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldGetSegmentsWithinTimeRange PASSED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldNotCreateSegmentThatIsAlreadyExpired STARTED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldNotCreateSegmentThatIsAlreadyExpired PASSED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldCreateSegments STARTED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldCreateSegments PASSED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldOpenExistingSegments STARTED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldOpenExistingSegments PASSED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldGetSegmentIdsFromTimestamp STARTED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldGetSegmentIdsFromTimestamp PASSED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldRollSegments STARTED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldRollSegments PASSED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldGetSegmentNameFromId STARTED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldGetSegmentNameFromId PASSED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldGetSegmentForTimestamp STARTED
org.apache.kafka.streams.state.internals.SegmentsTest > shouldGetSegmentForTimestamp PASSED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreTest > shouldFetchExactKeys STARTED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreTest > shouldFetchExactKeys PASSED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreTest > shouldRemove STARTED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreTest > shouldRemove PASSED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreTest > shouldFindValuesWithinMergingSessionWindowRange STARTED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreTest > shouldFindValuesWithinMergingSessionWindowRange PASSED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreTest > shouldFetchAllSessionsWithSameRecordKey STARTED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreTest > shouldFetchAllSessionsWithSameRecordKey PASSED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreTest > shouldFindSessionsToMerge STARTED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreTest > shouldFindSessionsToMerge PASSED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreTest > shouldPutAndFindSessionsInRange STARTED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreTest > shouldPutAndFindSessionsInRange PASSED
org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStoreTest > shouldFlushUnderlyingStore STARTED
org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStoreTest > shouldFlushUnderlyingStore PASSED
org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStoreTest > shouldCloseUnderlyingStore STARTED
org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStoreTest > shouldCloseUnderlyingStore PASSED
org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStoreTest > shouldLogPuts STARTED
org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStoreTest > shouldLogPuts PASSED
org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStoreTest > shouldInitUnderlyingStore STARTED
org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStoreTest > shouldInitUnderlyingStore PASSED
org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStoreTest > shouldLogRemoves STARTED
org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStoreTest > shouldLogRemoves PASSED
org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStoreTest > shouldDelegateToUnderlyingStoreWhenFetching STARTED
org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStoreTest > shouldDelegateToUnderlyingStoreWhenFetching PASSED
org.apache.kafka.streams.state.internals.WindowStoreUtilsTest > testSerialization STARTED
org.apache.kafka.streams.state.internals.WindowStoreUtilsTest > testSerialization PASSED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreSupplierTest > shouldHaveMeteredStoreWhenNotLoggedOrCached STARTED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreSupplierTest > shouldHaveMeteredStoreWhenNotLoggedOrCached PASSED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreSupplierTest > shouldReturnRocksDbStoreWhenCachingDisabled STARTED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreSupplierTest > shouldReturnRocksDbStoreWhenCachingDisabled PASSED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreSupplierTest > shouldCreateLoggingEnabledStoreWhenStoreLogged STARTED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreSupplierTest > shouldCreateLoggingEnabledStoreWhenStoreLogged PASSED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreSupplierTest > shouldReturnCachedSessionStoreWhenCachingEnabled STARTED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreSupplierTest > shouldReturnCachedSessionStoreWhenCachingEnabled PASSED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreSupplierTest > shouldNotBeLoggingEnabledStoreWhenLoggingNotEnabled STARTED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreSupplierTest > shouldNotBeLoggingEnabledStoreWhenLoggingNotEnabled PASSED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreSupplierTest > shouldHaveMeteredStoreWhenCached STARTED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreSupplierTest > shouldHaveMeteredStoreWhenCached PASSED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreSupplierTest > shouldHaveMeteredStoreWhenLogged STARTED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreSupplierTest > shouldHaveMeteredStoreWhenLogged PASSED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreSupplierTest > shouldReturnRocksDbStoreWhenCachingAndLoggingDisabled STARTED
org.apache.kafka.streams.state.internals.RocksDBSessionStoreSupplierTest > shouldReturnRocksDbStoreWhenCachingAndLoggingDisabled PASSED
org.apache.kafka.streams.state.StoresTest > shouldCreateInMemoryStoreSupplierWithLoggedConfig STARTED
org.apache.kafka.streams.state.StoresTest > shouldCreateInMemoryStoreSupplierWithLoggedConfig PASSED
org.apache.kafka.streams.state.StoresTest > shouldCreatePersistenStoreSupplierNotLogged STARTED
org.apache.kafka.streams.state.StoresTest > shouldCreatePersistenStoreSupplierNotLogged PASSED
org.apache.kafka.streams.state.StoresTest > shouldCreatePersistenStoreSupplierWithLoggedConfig STARTED
org.apache.kafka.streams.state.StoresTest > shouldCreatePersistenStoreSupplierWithLoggedConfig PASSED
org.apache.kafka.streams.state.StoresTest > shouldCreateInMemoryStoreSupplierNotLogged STARTED
org.apache.kafka.streams.state.StoresTest > shouldCreateInMemoryStoreSupplierNotLogged PASSED
1059 tests completed, 1 failed
:streams:test FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':streams:test'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/kafka-trunk-jdk7/ws/streams/build/reports/tests/test/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
BUILD FAILED
Total time: 2 hrs 10 mins 37.16 secs
Build step 'Execute shell' marked build as failure
[FINDBUGS] Collecting findbugs analysis files...
Setting GRADLE_3_4_RC_2_HOME=/home/jenkins/jenkins-slave/tools/hudson.plugins.gradle.GradleInstallation/Gradle_3.4-rc-2
[FINDBUGS] Searching for all files in <https://builds.apache.org/job/kafka-trunk-jdk7/ws/> that match the pattern **/build/reports/findbugs/*.xml
[FINDBUGS] Parsing 5 files in <https://builds.apache.org/job/kafka-trunk-jdk7/ws/>
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk7/ws/clients/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk7/ws/core/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk7/ws/examples/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk7/ws/log4j-appender/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk7/ws/streams/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Computing warning deltas based on reference build #2127
Recording test results
Setting GRADLE_3_4_RC_2_HOME=/home/jenkins/jenkins-slave/tools/hudson.plugins.gradle.GradleInstallation/Gradle_3.4-rc-2
Setting GRADLE_3_4_RC_2_HOME=/home/jenkins/jenkins-slave/tools/hudson.plugins.gradle.GradleInstallation/Gradle_3.4-rc-2
Not sending mail to unregistered user ismael@juma.me.uk
Not sending mail to unregistered user wangguoz@gmail.com