Posted to dev@kafka.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/01/22 05:18:11 UTC
Build failed in Jenkins: kafka-trunk-jdk8 #2343
See <https://builds.apache.org/job/kafka-trunk-jdk8/2343/display/redirect?page=changes>
Changes:
[me] KAFKA-6277: Ensure loadClass for plugin class loaders is thread-safe.
------------------------------------------
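[Editor's note: the KAFKA-6277 change above makes `loadClass` in Connect's plugin class loaders thread-safe. As a rough illustration only (not the actual Kafka patch; the class name here is hypothetical), the standard JDK idiom is to register the loader as parallel-capable and synchronize on the per-class-name lock instead of the loader itself:]

```java
// Illustrative sketch of a thread-safe loadClass override.
// `SafePluginClassLoader` is a made-up name, not a Kafka class.
public class SafePluginClassLoader extends ClassLoader {
    static {
        // Opt in to per-class-name locking so threads loading
        // different classes do not serialize on one monitor.
        ClassLoader.registerAsParallelCapable();
    }

    public SafePluginClassLoader(ClassLoader parent) {
        super(parent);
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve)
            throws ClassNotFoundException {
        // Lock on the JDK-provided per-name lock object rather than `this`.
        synchronized (getClassLoadingLock(name)) {
            return super.loadClass(name, resolve);
        }
    }
}
```

[Without `registerAsParallelCapable()`, the JDK falls back to locking the whole loader, which is where concurrent plugin loads can deadlock or race.]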
[...truncated 411.26 KB...]
kafka.utils.CoreUtilsTest > testReadInt STARTED
kafka.utils.CoreUtilsTest > testReadInt PASSED
kafka.utils.CoreUtilsTest > testAtomicGetOrUpdate STARTED
kafka.utils.CoreUtilsTest > testAtomicGetOrUpdate PASSED
kafka.utils.CoreUtilsTest > testUrlSafeBase64EncodeUUID STARTED
kafka.utils.CoreUtilsTest > testUrlSafeBase64EncodeUUID PASSED
kafka.utils.CoreUtilsTest > testCsvMap STARTED
kafka.utils.CoreUtilsTest > testCsvMap PASSED
kafka.utils.CoreUtilsTest > testInLock STARTED
kafka.utils.CoreUtilsTest > testInLock PASSED
kafka.utils.CoreUtilsTest > testTryAll STARTED
kafka.utils.CoreUtilsTest > testTryAll PASSED
kafka.utils.CoreUtilsTest > testSwallow STARTED
kafka.utils.CoreUtilsTest > testSwallow PASSED
kafka.utils.IteratorTemplateTest > testIterator STARTED
kafka.utils.IteratorTemplateTest > testIterator PASSED
kafka.utils.json.JsonValueTest > testJsonObjectIterator STARTED
kafka.utils.json.JsonValueTest > testJsonObjectIterator PASSED
kafka.utils.json.JsonValueTest > testDecodeLong STARTED
kafka.utils.json.JsonValueTest > testDecodeLong PASSED
kafka.utils.json.JsonValueTest > testAsJsonObject STARTED
kafka.utils.json.JsonValueTest > testAsJsonObject PASSED
kafka.utils.json.JsonValueTest > testDecodeDouble STARTED
kafka.utils.json.JsonValueTest > testDecodeDouble PASSED
kafka.utils.json.JsonValueTest > testDecodeOption STARTED
kafka.utils.json.JsonValueTest > testDecodeOption PASSED
kafka.utils.json.JsonValueTest > testDecodeString STARTED
kafka.utils.json.JsonValueTest > testDecodeString PASSED
kafka.utils.json.JsonValueTest > testJsonValueToString STARTED
kafka.utils.json.JsonValueTest > testJsonValueToString PASSED
kafka.utils.json.JsonValueTest > testAsJsonObjectOption STARTED
kafka.utils.json.JsonValueTest > testAsJsonObjectOption PASSED
kafka.utils.json.JsonValueTest > testAsJsonArrayOption STARTED
kafka.utils.json.JsonValueTest > testAsJsonArrayOption PASSED
kafka.utils.json.JsonValueTest > testAsJsonArray STARTED
kafka.utils.json.JsonValueTest > testAsJsonArray PASSED
kafka.utils.json.JsonValueTest > testJsonValueHashCode STARTED
kafka.utils.json.JsonValueTest > testJsonValueHashCode PASSED
kafka.utils.json.JsonValueTest > testDecodeInt STARTED
kafka.utils.json.JsonValueTest > testDecodeInt PASSED
kafka.utils.json.JsonValueTest > testDecodeMap STARTED
kafka.utils.json.JsonValueTest > testDecodeMap PASSED
kafka.utils.json.JsonValueTest > testDecodeSeq STARTED
kafka.utils.json.JsonValueTest > testDecodeSeq PASSED
kafka.utils.json.JsonValueTest > testJsonObjectGet STARTED
kafka.utils.json.JsonValueTest > testJsonObjectGet PASSED
kafka.utils.json.JsonValueTest > testJsonValueEquals STARTED
kafka.utils.json.JsonValueTest > testJsonValueEquals PASSED
kafka.utils.json.JsonValueTest > testJsonArrayIterator STARTED
kafka.utils.json.JsonValueTest > testJsonArrayIterator PASSED
kafka.utils.json.JsonValueTest > testJsonObjectApply STARTED
kafka.utils.json.JsonValueTest > testJsonObjectApply PASSED
kafka.utils.json.JsonValueTest > testDecodeBoolean STARTED
kafka.utils.json.JsonValueTest > testDecodeBoolean PASSED
kafka.producer.AsyncProducerTest > testFailedSendRetryLogic STARTED
kafka.producer.AsyncProducerTest > testFailedSendRetryLogic PASSED
kafka.producer.AsyncProducerTest > testQueueTimeExpired STARTED
kafka.producer.AsyncProducerTest > testQueueTimeExpired PASSED
kafka.producer.AsyncProducerTest > testPartitionAndCollateEvents STARTED
kafka.producer.AsyncProducerTest > testPartitionAndCollateEvents PASSED
kafka.producer.AsyncProducerTest > testBatchSize STARTED
kafka.producer.AsyncProducerTest > testBatchSize PASSED
kafka.producer.AsyncProducerTest > testSerializeEvents STARTED
kafka.producer.AsyncProducerTest > testSerializeEvents PASSED
kafka.producer.AsyncProducerTest > testProducerQueueSize STARTED
kafka.producer.AsyncProducerTest > testProducerQueueSize PASSED
kafka.producer.AsyncProducerTest > testRandomPartitioner STARTED
kafka.producer.AsyncProducerTest > testRandomPartitioner PASSED
kafka.producer.AsyncProducerTest > testInvalidConfiguration STARTED
kafka.producer.AsyncProducerTest > testInvalidConfiguration PASSED
kafka.producer.AsyncProducerTest > testInvalidPartition STARTED
kafka.producer.AsyncProducerTest > testInvalidPartition PASSED
kafka.producer.AsyncProducerTest > testNoBroker STARTED
kafka.producer.AsyncProducerTest > testNoBroker PASSED
kafka.producer.AsyncProducerTest > testProduceAfterClosed STARTED
kafka.producer.AsyncProducerTest > testProduceAfterClosed PASSED
kafka.producer.AsyncProducerTest > testJavaProducer STARTED
kafka.producer.AsyncProducerTest > testJavaProducer PASSED
kafka.producer.AsyncProducerTest > testIncompatibleEncoder STARTED
kafka.producer.AsyncProducerTest > testIncompatibleEncoder PASSED
kafka.producer.SyncProducerTest > testReachableServer STARTED
kafka.producer.SyncProducerTest > testReachableServer PASSED
kafka.producer.SyncProducerTest > testMessageSizeTooLarge STARTED
kafka.producer.SyncProducerTest > testMessageSizeTooLarge PASSED
kafka.producer.SyncProducerTest > testNotEnoughReplicas STARTED
kafka.producer.SyncProducerTest > testNotEnoughReplicas PASSED
kafka.producer.SyncProducerTest > testMessageSizeTooLargeWithAckZero STARTED
kafka.producer.SyncProducerTest > testMessageSizeTooLargeWithAckZero PASSED
kafka.producer.SyncProducerTest > testProducerCanTimeout STARTED
kafka.producer.SyncProducerTest > testProducerCanTimeout PASSED
kafka.producer.SyncProducerTest > testProduceRequestWithNoResponse STARTED
kafka.producer.SyncProducerTest > testProduceRequestWithNoResponse PASSED
kafka.producer.SyncProducerTest > testEmptyProduceRequest STARTED
kafka.producer.SyncProducerTest > testEmptyProduceRequest PASSED
kafka.producer.SyncProducerTest > testProduceCorrectlyReceivesResponse STARTED
kafka.producer.SyncProducerTest > testProduceCorrectlyReceivesResponse PASSED
kafka.producer.ProducerTest > testSendToNewTopic STARTED
kafka.producer.ProducerTest > testSendToNewTopic PASSED
kafka.producer.ProducerTest > testAsyncSendCanCorrectlyFailWithTimeout STARTED
kafka.producer.ProducerTest > testAsyncSendCanCorrectlyFailWithTimeout PASSED
kafka.producer.ProducerTest > testSendNullMessage STARTED
kafka.producer.ProducerTest > testSendNullMessage PASSED
kafka.producer.ProducerTest > testUpdateBrokerPartitionInfo STARTED
kafka.producer.ProducerTest > testUpdateBrokerPartitionInfo PASSED
kafka.producer.ProducerTest > testSendWithDeadBroker STARTED
kafka.producer.ProducerTest > testSendWithDeadBroker PASSED
1794 tests completed, 1 failed, 6 skipped
:kafka-trunk-jdk8:core:test FAILED
:test_core_2_11 FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':kafka-trunk-jdk8:core:test'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/reports/tests/test/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1h 5m 20s
16 actionable tasks: 15 executed, 1 up-to-date
Build step 'Execute shell' marked build as failure
[FINDBUGS] Collecting findbugs analysis files...
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
[FINDBUGS] Searching for all files in <https://builds.apache.org/job/kafka-trunk-jdk8/ws/> that match the pattern **/build/reports/findbugs/*.xml
[FINDBUGS] Parsing 1 file in <https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
<Git Blamer> Using GitBlamer to create author and commit information for all warnings.
<Git Blamer> GIT_COMMIT=0fa52644debfbc20cda0a93678140537fa2cb24c, workspace=<https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Computing warning deltas based on reference build #2342
Recording test results
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
Jenkins build is back to normal : kafka-trunk-jdk8 #2349
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/kafka-trunk-jdk8/2349/display/redirect?page=changes>
Build failed in Jenkins: kafka-trunk-jdk8 #2348
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/kafka-trunk-jdk8/2348/display/redirect>
------------------------------------------
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on ubuntu-eu2 (ubuntu trusty) in workspace <https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url https://github.com/apache/kafka.git # timeout=10
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from https://github.com/apache/kafka.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:825)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1092)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1123)
at hudson.scm.SCM.checkout(SCM.java:495)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1724)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:421)
Caused by: hudson.plugins.git.GitException: Command "git config remote.origin.url https://github.com/apache/kafka.git" returned status code 4:
stdout:
stderr: error: failed to write new configuration file .git/config.lock
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:1970)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:1938)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:1934)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:1572)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:1584)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.setRemoteUrl(CliGitAPIImpl.java:1218)
at hudson.plugins.git.GitAPI.setRemoteUrl(GitAPI.java:160)
at sun.reflect.GeneratedMethodAccessor69.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.perform(RemoteInvocationHandler.java:922)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:896)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:853)
at hudson.remoting.UserRequest.perform(UserRequest.java:207)
at hudson.remoting.UserRequest.perform(UserRequest.java:53)
at hudson.remoting.Request$2.run(Request.java:358)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Suppressed: hudson.remoting.Channel$CallSiteStackTrace: Remote call to ubuntu-eu2
at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1693)
at hudson.remoting.UserResponse.retrieve(UserRequest.java:310)
at hudson.remoting.Channel.call(Channel.java:908)
at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:281)
at com.sun.proxy.$Proxy110.setRemoteUrl(Unknown Source)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl.setRemoteUrl(RemoteGitImpl.java:295)
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:813)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1092)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1123)
at hudson.scm.SCM.checkout(SCM.java:495)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1724)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:421)
ERROR: Error fetching remote repo 'origin'
Retrying after 10 seconds
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url https://github.com/apache/kafka.git # timeout=10
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from https://github.com/apache/kafka.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:825)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1092)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1123)
at hudson.scm.SCM.checkout(SCM.java:495)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1724)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:421)
Caused by: hudson.plugins.git.GitException: Command "git config remote.origin.url https://github.com/apache/kafka.git" returned status code 4:
stdout:
stderr: error: failed to write new configuration file .git/config.lock
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:1970)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:1938)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:1934)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:1572)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:1584)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.setRemoteUrl(CliGitAPIImpl.java:1218)
at hudson.plugins.git.GitAPI.setRemoteUrl(GitAPI.java:160)
at sun.reflect.GeneratedMethodAccessor69.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.perform(RemoteInvocationHandler.java:922)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:896)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:853)
at hudson.remoting.UserRequest.perform(UserRequest.java:207)
at hudson.remoting.UserRequest.perform(UserRequest.java:53)
at hudson.remoting.Request$2.run(Request.java:358)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Suppressed: hudson.remoting.Channel$CallSiteStackTrace: Remote call to ubuntu-eu2
at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1693)
at hudson.remoting.UserResponse.retrieve(UserRequest.java:310)
at hudson.remoting.Channel.call(Channel.java:908)
at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:281)
at com.sun.proxy.$Proxy110.setRemoteUrl(Unknown Source)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl.setRemoteUrl(RemoteGitImpl.java:295)
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:813)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1092)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1123)
at hudson.scm.SCM.checkout(SCM.java:495)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1724)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:421)
ERROR: Error fetching remote repo 'origin'
Retrying after 10 seconds
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url https://github.com/apache/kafka.git # timeout=10
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from https://github.com/apache/kafka.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:825)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1092)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1123)
at hudson.scm.SCM.checkout(SCM.java:495)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1724)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:421)
Caused by: hudson.plugins.git.GitException: Command "git config remote.origin.url https://github.com/apache/kafka.git" returned status code 4:
stdout:
stderr: error: failed to write new configuration file .git/config.lock
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:1970)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:1938)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:1934)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:1572)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommand(CliGitAPIImpl.java:1584)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.setRemoteUrl(CliGitAPIImpl.java:1218)
at hudson.plugins.git.GitAPI.setRemoteUrl(GitAPI.java:160)
at sun.reflect.GeneratedMethodAccessor69.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.perform(RemoteInvocationHandler.java:922)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:896)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.call(RemoteInvocationHandler.java:853)
at hudson.remoting.UserRequest.perform(UserRequest.java:207)
at hudson.remoting.UserRequest.perform(UserRequest.java:53)
at hudson.remoting.Request$2.run(Request.java:358)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Suppressed: hudson.remoting.Channel$CallSiteStackTrace: Remote call to ubuntu-eu2
at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1693)
at hudson.remoting.UserResponse.retrieve(UserRequest.java:310)
at hudson.remoting.Channel.call(Channel.java:908)
at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:281)
at com.sun.proxy.$Proxy110.setRemoteUrl(Unknown Source)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl.setRemoteUrl(RemoteGitImpl.java:295)
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:813)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1092)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1123)
at hudson.scm.SCM.checkout(SCM.java:495)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1202)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1724)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:421)
ERROR: Error fetching remote repo 'origin'
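[Editor's note: the repeated "failed to write new configuration file .git/config.lock" errors above usually mean a previous git process on the agent died and left its lock file behind, so every retry fails the same way. A minimal sketch of the usual cleanup heuristic, with a hypothetical path and age threshold (if no git process is running, a sufficiently old lock is safe to delete):]

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.time.Duration;
import java.time.Instant;

public class StaleLockCheck {
    /** Delete the lock file if it is older than maxAge; return true if deleted. */
    static boolean clearIfStale(Path lock, Duration maxAge) throws IOException {
        if (!Files.exists(lock)) {
            return false;
        }
        Instant mtime = Files.getLastModifiedTime(lock).toInstant();
        if (mtime.isBefore(Instant.now().minus(maxAge))) {
            // Heuristic: an hour-old lock with no live git process is stale.
            Files.delete(lock);
            return true;
        }
        return false;
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical default path; the real Jenkins workspace differs.
        Path lock = Paths.get(args.length > 0 ? args[0] : ".git/config.lock");
        System.out.println(clearIfStale(lock, Duration.ofHours(1))
                ? "Removed stale lock: " + lock
                : "No stale lock to remove");
    }
}
```

[In practice Jenkins jobs often sidestep this entirely by wiping the workspace or re-cloning when the checkout fails repeatedly, as the retries above eventually gave up.]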
[FINDBUGS] Collecting findbugs analysis files...
[FINDBUGS] Searching for all files in <https://builds.apache.org/job/kafka-trunk-jdk8/ws/> that match the pattern **/build/reports/findbugs/*.xml
[FINDBUGS] Parsing 2 files in <https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/clients/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
<Git Blamer> Using GitBlamer to create author and commit information for all warnings.
<Git Blamer> GIT_COMMIT=5e04fb8d8e55a5281bd40da9e1c99e9eb5e3c5f5, workspace=<https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Computing warning deltas based on reference build #2342
Recording test results
ERROR: Step ‘Publish JUnit test result report’ failed: Test reports were found but none of them are new. Did leafNodes run?
For example, <https://builds.apache.org/job/kafka-trunk-jdk8/ws/clients/build/test-results/test/TEST-org.apache.kafka.clients.ApiVersionsTest.xml> is 17 hr old
Not sending mail to unregistered user github@alasdairhodge.co.uk
Not sending mail to unregistered user wangguoz@gmail.com
Build failed in Jenkins: kafka-trunk-jdk8 #2347
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/kafka-trunk-jdk8/2347/display/redirect?page=changes>
Changes:
[github] KAFKA-6287; Consumer group command should list simple consumer groups
------------------------------------------
[...truncated 3.40 MB...]
kafka.utils.CoreUtilsTest > testReadInt PASSED
kafka.utils.CoreUtilsTest > testAtomicGetOrUpdate STARTED
kafka.utils.CoreUtilsTest > testAtomicGetOrUpdate PASSED
kafka.utils.CoreUtilsTest > testUrlSafeBase64EncodeUUID STARTED
kafka.utils.CoreUtilsTest > testUrlSafeBase64EncodeUUID PASSED
kafka.utils.CoreUtilsTest > testCsvMap STARTED
kafka.utils.CoreUtilsTest > testCsvMap PASSED
kafka.utils.CoreUtilsTest > testInLock STARTED
kafka.utils.CoreUtilsTest > testInLock PASSED
kafka.utils.CoreUtilsTest > testTryAll STARTED
kafka.utils.CoreUtilsTest > testTryAll PASSED
kafka.utils.CoreUtilsTest > testSwallow STARTED
kafka.utils.CoreUtilsTest > testSwallow PASSED
kafka.utils.IteratorTemplateTest > testIterator STARTED
kafka.utils.IteratorTemplateTest > testIterator PASSED
kafka.utils.json.JsonValueTest > testJsonObjectIterator STARTED
kafka.utils.json.JsonValueTest > testJsonObjectIterator PASSED
kafka.utils.json.JsonValueTest > testDecodeLong STARTED
kafka.utils.json.JsonValueTest > testDecodeLong PASSED
kafka.utils.json.JsonValueTest > testAsJsonObject STARTED
kafka.utils.json.JsonValueTest > testAsJsonObject PASSED
kafka.utils.json.JsonValueTest > testDecodeDouble STARTED
kafka.utils.json.JsonValueTest > testDecodeDouble PASSED
kafka.utils.json.JsonValueTest > testDecodeOption STARTED
kafka.utils.json.JsonValueTest > testDecodeOption PASSED
kafka.utils.json.JsonValueTest > testDecodeString STARTED
kafka.utils.json.JsonValueTest > testDecodeString PASSED
kafka.utils.json.JsonValueTest > testJsonValueToString STARTED
kafka.utils.json.JsonValueTest > testJsonValueToString PASSED
kafka.utils.json.JsonValueTest > testAsJsonObjectOption STARTED
kafka.utils.json.JsonValueTest > testAsJsonObjectOption PASSED
kafka.utils.json.JsonValueTest > testAsJsonArrayOption STARTED
kafka.utils.json.JsonValueTest > testAsJsonArrayOption PASSED
kafka.utils.json.JsonValueTest > testAsJsonArray STARTED
kafka.utils.json.JsonValueTest > testAsJsonArray PASSED
kafka.utils.json.JsonValueTest > testJsonValueHashCode STARTED
kafka.utils.json.JsonValueTest > testJsonValueHashCode PASSED
kafka.utils.json.JsonValueTest > testDecodeInt STARTED
kafka.utils.json.JsonValueTest > testDecodeInt PASSED
kafka.utils.json.JsonValueTest > testDecodeMap STARTED
kafka.utils.json.JsonValueTest > testDecodeMap PASSED
kafka.utils.json.JsonValueTest > testDecodeSeq STARTED
kafka.utils.json.JsonValueTest > testDecodeSeq PASSED
kafka.utils.json.JsonValueTest > testJsonObjectGet STARTED
kafka.utils.json.JsonValueTest > testJsonObjectGet PASSED
kafka.utils.json.JsonValueTest > testJsonValueEquals STARTED
kafka.utils.json.JsonValueTest > testJsonValueEquals PASSED
kafka.utils.json.JsonValueTest > testJsonArrayIterator STARTED
kafka.utils.json.JsonValueTest > testJsonArrayIterator PASSED
kafka.utils.json.JsonValueTest > testJsonObjectApply STARTED
kafka.utils.json.JsonValueTest > testJsonObjectApply PASSED
kafka.utils.json.JsonValueTest > testDecodeBoolean STARTED
kafka.utils.json.JsonValueTest > testDecodeBoolean PASSED
kafka.producer.AsyncProducerTest > testFailedSendRetryLogic STARTED
kafka.producer.AsyncProducerTest > testFailedSendRetryLogic PASSED
kafka.producer.AsyncProducerTest > testQueueTimeExpired STARTED
kafka.producer.AsyncProducerTest > testQueueTimeExpired PASSED
kafka.producer.AsyncProducerTest > testPartitionAndCollateEvents STARTED
kafka.producer.AsyncProducerTest > testPartitionAndCollateEvents PASSED
kafka.producer.AsyncProducerTest > testBatchSize STARTED
kafka.producer.AsyncProducerTest > testBatchSize PASSED
kafka.producer.AsyncProducerTest > testSerializeEvents STARTED
kafka.producer.AsyncProducerTest > testSerializeEvents PASSED
kafka.producer.AsyncProducerTest > testProducerQueueSize STARTED
kafka.producer.AsyncProducerTest > testProducerQueueSize PASSED
kafka.producer.AsyncProducerTest > testRandomPartitioner STARTED
kafka.producer.AsyncProducerTest > testRandomPartitioner PASSED
kafka.producer.AsyncProducerTest > testInvalidConfiguration STARTED
kafka.producer.AsyncProducerTest > testInvalidConfiguration PASSED
kafka.producer.AsyncProducerTest > testInvalidPartition STARTED
kafka.producer.AsyncProducerTest > testInvalidPartition PASSED
kafka.producer.AsyncProducerTest > testNoBroker STARTED
kafka.producer.AsyncProducerTest > testNoBroker PASSED
kafka.producer.AsyncProducerTest > testProduceAfterClosed STARTED
kafka.producer.AsyncProducerTest > testProduceAfterClosed PASSED
kafka.producer.AsyncProducerTest > testJavaProducer STARTED
kafka.producer.AsyncProducerTest > testJavaProducer PASSED
kafka.producer.AsyncProducerTest > testIncompatibleEncoder STARTED
kafka.producer.AsyncProducerTest > testIncompatibleEncoder PASSED
kafka.producer.SyncProducerTest > testReachableServer STARTED
kafka.producer.SyncProducerTest > testReachableServer PASSED
kafka.producer.SyncProducerTest > testMessageSizeTooLarge STARTED
kafka.producer.SyncProducerTest > testMessageSizeTooLarge PASSED
kafka.producer.SyncProducerTest > testNotEnoughReplicas STARTED
kafka.producer.SyncProducerTest > testNotEnoughReplicas PASSED
kafka.producer.SyncProducerTest > testMessageSizeTooLargeWithAckZero STARTED
kafka.producer.SyncProducerTest > testMessageSizeTooLargeWithAckZero PASSED
kafka.producer.SyncProducerTest > testProducerCanTimeout STARTED
kafka.producer.SyncProducerTest > testProducerCanTimeout PASSED
kafka.producer.SyncProducerTest > testProduceRequestWithNoResponse STARTED
kafka.producer.SyncProducerTest > testProduceRequestWithNoResponse PASSED
kafka.producer.SyncProducerTest > testEmptyProduceRequest STARTED
kafka.producer.SyncProducerTest > testEmptyProduceRequest PASSED
kafka.producer.SyncProducerTest > testProduceCorrectlyReceivesResponse STARTED
kafka.producer.SyncProducerTest > testProduceCorrectlyReceivesResponse PASSED
kafka.producer.ProducerTest > testSendToNewTopic STARTED
kafka.producer.ProducerTest > testSendToNewTopic PASSED
kafka.producer.ProducerTest > testAsyncSendCanCorrectlyFailWithTimeout STARTED
kafka.producer.ProducerTest > testAsyncSendCanCorrectlyFailWithTimeout PASSED
kafka.producer.ProducerTest > testSendNullMessage STARTED
kafka.producer.ProducerTest > testSendNullMessage PASSED
kafka.producer.ProducerTest > testUpdateBrokerPartitionInfo STARTED
kafka.producer.ProducerTest > testUpdateBrokerPartitionInfo PASSED
kafka.producer.ProducerTest > testSendWithDeadBroker STARTED
kafka.producer.ProducerTest > testSendWithDeadBroker PASSED
1804 tests completed, 1 failed, 6 skipped
:core:test FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':core:test'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/reports/tests/test/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1h 9m 48s
31 actionable tasks: 28 executed, 3 up-to-date
Build step 'Execute shell' marked build as failure
[FINDBUGS] Collecting findbugs analysis files...
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
[FINDBUGS] Searching for all files in <https://builds.apache.org/job/kafka-trunk-jdk8/ws/> that match the pattern **/build/reports/findbugs/*.xml
[FINDBUGS] Parsing 2 files in <https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/clients/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
<Git Blamer> Using GitBlamer to create author and commit information for all warnings.
<Git Blamer> GIT_COMMIT=5e04fb8d8e55a5281bd40da9e1c99e9eb5e3c5f5, workspace=<https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Computing warning deltas based on reference build #2342
Recording test results
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
Not sending mail to unregistered user github@alasdairhodge.co.uk
Not sending mail to unregistered user wangguoz@gmail.com
Build failed in Jenkins: kafka-trunk-jdk8 #2346
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/kafka-trunk-jdk8/2346/display/redirect?page=changes>
Changes:
[wangguoz] KAFKA-6461 TableTableJoinIntegrationTest is unstable if caching is
------------------------------------------
[...truncated 1.85 MB...]
org.apache.kafka.streams.kstream.internals.KTableAggregateTest > testRemoveOldBeforeAddNew STARTED
org.apache.kafka.streams.kstream.internals.KTableAggregateTest > testRemoveOldBeforeAddNew PASSED
org.apache.kafka.streams.kstream.internals.KTableAggregateTest > testCountCoalesced STARTED
org.apache.kafka.streams.kstream.internals.KTableAggregateTest > testCountCoalesced PASSED
org.apache.kafka.streams.kstream.internals.KTableForeachTest > testForeach STARTED
org.apache.kafka.streams.kstream.internals.KTableForeachTest > testForeach PASSED
org.apache.kafka.streams.kstream.internals.KTableForeachTest > testTypeVariance STARTED
org.apache.kafka.streams.kstream.internals.KTableForeachTest > testTypeVariance PASSED
org.apache.kafka.streams.kstream.SessionWindowsTest > retentionTimeShouldBeGapIfGapIsLargerThanDefaultRetentionTime STARTED
org.apache.kafka.streams.kstream.SessionWindowsTest > retentionTimeShouldBeGapIfGapIsLargerThanDefaultRetentionTime PASSED
org.apache.kafka.streams.kstream.SessionWindowsTest > shouldSetWindowGap STARTED
org.apache.kafka.streams.kstream.SessionWindowsTest > shouldSetWindowGap PASSED
org.apache.kafka.streams.kstream.SessionWindowsTest > shouldBeEqualWhenGapAndMaintainMsAreTheSame STARTED
org.apache.kafka.streams.kstream.SessionWindowsTest > shouldBeEqualWhenGapAndMaintainMsAreTheSame PASSED
org.apache.kafka.streams.kstream.SessionWindowsTest > retentionTimeMustNotBeNegative STARTED
org.apache.kafka.streams.kstream.SessionWindowsTest > retentionTimeMustNotBeNegative PASSED
org.apache.kafka.streams.kstream.SessionWindowsTest > shouldNotBeEqualWhenMaintainMsDifferent STARTED
org.apache.kafka.streams.kstream.SessionWindowsTest > shouldNotBeEqualWhenMaintainMsDifferent PASSED
org.apache.kafka.streams.kstream.SessionWindowsTest > windowSizeMustNotBeZero STARTED
org.apache.kafka.streams.kstream.SessionWindowsTest > windowSizeMustNotBeZero PASSED
org.apache.kafka.streams.kstream.SessionWindowsTest > windowSizeMustNotBeNegative STARTED
org.apache.kafka.streams.kstream.SessionWindowsTest > windowSizeMustNotBeNegative PASSED
org.apache.kafka.streams.kstream.SessionWindowsTest > shouldSetWindowRetentionTime STARTED
org.apache.kafka.streams.kstream.SessionWindowsTest > shouldSetWindowRetentionTime PASSED
org.apache.kafka.streams.kstream.SessionWindowsTest > shouldNotBeEqualWhenGapIsDifferent STARTED
org.apache.kafka.streams.kstream.SessionWindowsTest > shouldNotBeEqualWhenGapIsDifferent PASSED
org.apache.kafka.streams.kstream.WindowTest > shouldThrowIfEndIsSmallerThanStart STARTED
org.apache.kafka.streams.kstream.WindowTest > shouldThrowIfEndIsSmallerThanStart PASSED
org.apache.kafka.streams.kstream.WindowTest > shouldNotBeEqualIfDifferentWindowType STARTED
org.apache.kafka.streams.kstream.WindowTest > shouldNotBeEqualIfDifferentWindowType PASSED
org.apache.kafka.streams.kstream.WindowTest > shouldBeEqualIfStartAndEndSame STARTED
org.apache.kafka.streams.kstream.WindowTest > shouldBeEqualIfStartAndEndSame PASSED
org.apache.kafka.streams.kstream.WindowTest > shouldNotBeEqualIfNull STARTED
org.apache.kafka.streams.kstream.WindowTest > shouldNotBeEqualIfNull PASSED
org.apache.kafka.streams.kstream.WindowTest > shouldThrowIfStartIsNegative STARTED
org.apache.kafka.streams.kstream.WindowTest > shouldThrowIfStartIsNegative PASSED
org.apache.kafka.streams.kstream.WindowTest > shouldNotBeEqualIfStartOrEndIsDifferent STARTED
org.apache.kafka.streams.kstream.WindowTest > shouldNotBeEqualIfStartOrEndIsDifferent PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddRegexTopicToLatestAutoOffsetResetList STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddRegexTopicToLatestAutoOffsetResetList PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldMapStateStoresToCorrectSourceTopics STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldMapStateStoresToCorrectSourceTopics PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddTableToEarliestAutoOffsetResetList STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddTableToEarliestAutoOffsetResetList PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddTimestampExtractorToTablePerSource STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddTimestampExtractorToTablePerSource PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldBuildGlobalTopologyWithAllGlobalTables STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldBuildGlobalTopologyWithAllGlobalTables PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddRegexTopicToEarliestAutoOffsetResetList STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddRegexTopicToEarliestAutoOffsetResetList PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddTopicToEarliestAutoOffsetResetList STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddTopicToEarliestAutoOffsetResetList PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldHaveCorrectSourceTopicsForTableFromMergedStream STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldHaveCorrectSourceTopicsForTableFromMergedStream PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > testNewStoreName STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > testNewStoreName PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > testMerge STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > testMerge PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > testFrom STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > testFrom PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddTopicToLatestAutoOffsetResetList STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddTopicToLatestAutoOffsetResetList PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > kStreamTimestampExtractorShouldBeNull STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > kStreamTimestampExtractorShouldBeNull PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldBuildSimpleGlobalTableTopology STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldBuildSimpleGlobalTableTopology PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldProcessFromSinkTopic STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldProcessFromSinkTopic PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddTableToLatestAutoOffsetResetList STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddTableToLatestAutoOffsetResetList PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldNotAddTableToOffsetResetLists STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldNotAddTableToOffsetResetLists PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddTimestampExtractorToStreamWithKeyValSerdePerSource STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddTimestampExtractorToStreamWithKeyValSerdePerSource PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddTimestampExtractorToTableWithKeyValSerdePerSource STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddTimestampExtractorToTableWithKeyValSerdePerSource PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddGlobalTablesToEachGroup STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddGlobalTablesToEachGroup PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldNotAddRegexTopicsToOffsetResetLists STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldNotAddRegexTopicsToOffsetResetLists PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldThrowExceptionWhenTopicNamesAreNull STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldThrowExceptionWhenTopicNamesAreNull PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldThrowExceptionWhenNoTopicPresent STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldThrowExceptionWhenNoTopicPresent PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > kTableTimestampExtractorShouldBeNull STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > kTableTimestampExtractorShouldBeNull PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > testNewName STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > testNewName PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldBuildGlobalTopologyWithAllGlobalTablesWithInternalStoreName STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldBuildGlobalTopologyWithAllGlobalTablesWithInternalStoreName PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldStillMaterializeSourceKTableIfStateNameNotSpecified STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldStillMaterializeSourceKTableIfStateNameNotSpecified PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddTimestampExtractorToStreamWithOffsetResetPerSource STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldAddTimestampExtractorToStreamWithOffsetResetPerSource PASSED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldProcessViaThroughTopic STARTED
org.apache.kafka.streams.kstream.KStreamBuilderTest > shouldProcessViaThroughTopic PASSED
org.apache.kafka.streams.kstream.WindowsTest > retentionTimeMustNotBeNegative STARTED
org.apache.kafka.streams.kstream.WindowsTest > retentionTimeMustNotBeNegative PASSED
org.apache.kafka.streams.kstream.WindowsTest > numberOfSegmentsMustBeAtLeastTwo STARTED
org.apache.kafka.streams.kstream.WindowsTest > numberOfSegmentsMustBeAtLeastTwo PASSED
org.apache.kafka.streams.kstream.WindowsTest > shouldSetWindowRetentionTime STARTED
org.apache.kafka.streams.kstream.WindowsTest > shouldSetWindowRetentionTime PASSED
org.apache.kafka.streams.kstream.WindowsTest > shouldSetNumberOfSegments STARTED
org.apache.kafka.streams.kstream.WindowsTest > shouldSetNumberOfSegments PASSED
1847 tests completed, 8 failed
:streams:test FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':streams:test'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/build/reports/tests/test/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1h 53m 55s
52 actionable tasks: 31 executed, 21 up-to-date
Build step 'Execute shell' marked build as failure
[FINDBUGS] Collecting findbugs analysis files...
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
[FINDBUGS] Searching for all files in <https://builds.apache.org/job/kafka-trunk-jdk8/ws/> that match the pattern **/build/reports/findbugs/*.xml
[FINDBUGS] Parsing 5 files in <https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/clients/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/examples/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/log4j-appender/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
<Git Blamer> Using GitBlamer to create author and commit information for all warnings.
<Git Blamer> GIT_COMMIT=faafdbe014ca6ac7e253be6934f78885642666ed, workspace=<https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Computing warning deltas based on reference build #2342
Recording test results
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
Not sending mail to unregistered user wangguoz@gmail.com
Build failed in Jenkins: kafka-trunk-jdk8 #2345
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/kafka-trunk-jdk8/2345/display/redirect?page=changes>
Changes:
[jason] KAFKA-6241; Enable dynamic updates of broker SSL keystore (#4263)
------------------------------------------
[...truncated 123.92 KB...]
kafka.tools.ConsoleProducerTest > testValidConfigsOldProducer PASSED
kafka.tools.ConsoleProducerTest > testInvalidConfigs STARTED
kafka.tools.ConsoleProducerTest > testInvalidConfigs PASSED
kafka.tools.ConsoleProducerTest > testValidConfigsNewProducer STARTED
kafka.tools.ConsoleProducerTest > testValidConfigsNewProducer PASSED
kafka.tools.ReplicaVerificationToolTest > testReplicaBufferVerifyChecksum STARTED
kafka.tools.ReplicaVerificationToolTest > testReplicaBufferVerifyChecksum PASSED
kafka.zk.ZKEphemeralTest > testOverlappingSessions[0] STARTED
kafka.zk.ZKEphemeralTest > testOverlappingSessions[0] PASSED
kafka.zk.ZKEphemeralTest > testEphemeralNodeCleanup[0] STARTED
kafka.zk.ZKEphemeralTest > testEphemeralNodeCleanup[0] PASSED
kafka.zk.ZKEphemeralTest > testZkWatchedEphemeral[0] STARTED
kafka.zk.ZKEphemeralTest > testZkWatchedEphemeral[0] PASSED
kafka.zk.ZKEphemeralTest > testSameSession[0] STARTED
kafka.zk.ZKEphemeralTest > testSameSession[0] PASSED
kafka.zk.ZKEphemeralTest > testOverlappingSessions[1] STARTED
kafka.zk.ZKEphemeralTest > testOverlappingSessions[1] PASSED
kafka.zk.ZKEphemeralTest > testEphemeralNodeCleanup[1] STARTED
kafka.zk.ZKEphemeralTest > testEphemeralNodeCleanup[1] PASSED
kafka.zk.ZKEphemeralTest > testZkWatchedEphemeral[1] STARTED
kafka.zk.ZKEphemeralTest > testZkWatchedEphemeral[1] PASSED
kafka.zk.ZKEphemeralTest > testSameSession[1] STARTED
kafka.zk.ZKEphemeralTest > testSameSession[1] PASSED
kafka.zk.KafkaZkClientTest > testBrokerRegistrationMethods STARTED
kafka.zk.KafkaZkClientTest > testBrokerRegistrationMethods PASSED
kafka.zk.KafkaZkClientTest > testSetGetAndDeletePartitionReassignment STARTED
kafka.zk.KafkaZkClientTest > testSetGetAndDeletePartitionReassignment PASSED
kafka.zk.KafkaZkClientTest > testGetDataAndVersion STARTED
kafka.zk.KafkaZkClientTest > testGetDataAndVersion PASSED
kafka.zk.KafkaZkClientTest > testGetChildren STARTED
kafka.zk.KafkaZkClientTest > testGetChildren PASSED
kafka.zk.KafkaZkClientTest > testSetAndGetConsumerOffset STARTED
kafka.zk.KafkaZkClientTest > testSetAndGetConsumerOffset PASSED
kafka.zk.KafkaZkClientTest > testClusterIdMethods STARTED
kafka.zk.KafkaZkClientTest > testClusterIdMethods PASSED
kafka.zk.KafkaZkClientTest > testEntityConfigManagementMethods STARTED
kafka.zk.KafkaZkClientTest > testEntityConfigManagementMethods PASSED
kafka.zk.KafkaZkClientTest > testCreateRecursive STARTED
kafka.zk.KafkaZkClientTest > testCreateRecursive PASSED
kafka.zk.KafkaZkClientTest > testGetConsumerOffsetNoData STARTED
kafka.zk.KafkaZkClientTest > testGetConsumerOffsetNoData PASSED
kafka.zk.KafkaZkClientTest > testDeleteTopicPathMethods STARTED
kafka.zk.KafkaZkClientTest > testDeleteTopicPathMethods PASSED
kafka.zk.KafkaZkClientTest > testAclManagementMethods STARTED
kafka.zk.KafkaZkClientTest > testAclManagementMethods PASSED
kafka.zk.KafkaZkClientTest > testPreferredReplicaElectionMethods STARTED
kafka.zk.KafkaZkClientTest > testPreferredReplicaElectionMethods PASSED
kafka.zk.KafkaZkClientTest > testPropagateLogDir STARTED
kafka.zk.KafkaZkClientTest > testPropagateLogDir PASSED
kafka.zk.KafkaZkClientTest > testGetDataAndStat STARTED
kafka.zk.KafkaZkClientTest > testGetDataAndStat PASSED
kafka.zk.KafkaZkClientTest > testCreateTopLevelPaths STARTED
kafka.zk.KafkaZkClientTest > testCreateTopLevelPaths PASSED
kafka.zk.KafkaZkClientTest > testBrokerSequenceIdMethods STARTED
kafka.zk.KafkaZkClientTest > testBrokerSequenceIdMethods PASSED
kafka.zk.KafkaZkClientTest > testCreateSequentialPersistentPath STARTED
kafka.zk.KafkaZkClientTest > testCreateSequentialPersistentPath PASSED
kafka.zk.KafkaZkClientTest > testConditionalUpdatePath STARTED
kafka.zk.KafkaZkClientTest > testConditionalUpdatePath PASSED
kafka.zk.KafkaZkClientTest > testTopicAssignmentMethods STARTED
kafka.zk.KafkaZkClientTest > testTopicAssignmentMethods PASSED
kafka.zk.KafkaZkClientTest > testPropagateIsrChanges STARTED
kafka.zk.KafkaZkClientTest > testPropagateIsrChanges PASSED
kafka.zk.KafkaZkClientTest > testDeleteRecursive STARTED
kafka.zk.KafkaZkClientTest > testDeleteRecursive PASSED
kafka.zk.KafkaZkClientTest > testDelegationTokenMethods STARTED
kafka.zk.KafkaZkClientTest > testDelegationTokenMethods PASSED
kafka.zk.ZKPathTest > testCreatePersistentSequentialThrowsException STARTED
kafka.zk.ZKPathTest > testCreatePersistentSequentialThrowsException PASSED
kafka.zk.ZKPathTest > testCreatePersistentSequentialExists STARTED
kafka.zk.ZKPathTest > testCreatePersistentSequentialExists PASSED
kafka.zk.ZKPathTest > testCreateEphemeralPathExists STARTED
kafka.zk.ZKPathTest > testCreateEphemeralPathExists PASSED
kafka.zk.ZKPathTest > testCreatePersistentPath STARTED
kafka.zk.ZKPathTest > testCreatePersistentPath PASSED
kafka.zk.ZKPathTest > testMakeSurePersistsPathExistsThrowsException STARTED
kafka.zk.ZKPathTest > testMakeSurePersistsPathExistsThrowsException PASSED
kafka.zk.ZKPathTest > testCreateEphemeralPathThrowsException STARTED
kafka.zk.ZKPathTest > testCreateEphemeralPathThrowsException PASSED
kafka.zk.ZKPathTest > testCreatePersistentPathThrowsException STARTED
kafka.zk.ZKPathTest > testCreatePersistentPathThrowsException PASSED
kafka.zk.ZKPathTest > testMakeSurePersistsPathExists STARTED
kafka.zk.ZKPathTest > testMakeSurePersistsPathExists PASSED
kafka.server.LogOffsetTest > testFetchOffsetsBeforeWithChangingSegmentSize STARTED
kafka.server.LogOffsetTest > testFetchOffsetsBeforeWithChangingSegmentSize PASSED
kafka.server.LogOffsetTest > testGetOffsetsBeforeEarliestTime STARTED
kafka.server.LogOffsetTest > testGetOffsetsBeforeEarliestTime PASSED
kafka.server.LogOffsetTest > testGetOffsetsForUnknownTopic STARTED
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
ERROR: Could not install GRADLE_3_5_HOME
java.lang.NullPointerException
at hudson.plugins.toolenv.ToolEnvBuildWrapper$1.buildEnvVars(ToolEnvBuildWrapper.java:46)
at hudson.model.AbstractBuild.getEnvironment(AbstractBuild.java:895)
at hudson.plugins.git.GitSCM.getParamExpandedRepos(GitSCM.java:421)
at hudson.plugins.git.GitSCM.compareRemoteRevisionWithImpl(GitSCM.java:629)
at hudson.plugins.git.GitSCM.compareRemoteRevisionWith(GitSCM.java:594)
at hudson.scm.SCM.compareRemoteRevisionWith(SCM.java:391)
at hudson.scm.SCM.poll(SCM.java:408)
at hudson.model.AbstractProject._poll(AbstractProject.java:1384)
at hudson.model.AbstractProject.poll(AbstractProject.java:1287)
at hudson.triggers.SCMTrigger$Runner.runPolling(SCMTrigger.java:594)
at hudson.triggers.SCMTrigger$Runner.run(SCMTrigger.java:640)
at hudson.util.SequentialExecutionQueue$QueueEntry.run(SequentialExecutionQueue.java:119)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
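The NullPointerException above originates in ToolEnvBuildWrapper.buildEnvVars right after the lookup for GRADLE_3_4_RC_2_HOME finds no matching tool installation on the node. A minimal, hypothetical sketch of that failure mode (the map contents and method names are stand-ins, not the Hudson plugin's actual internals):

```java
import java.util.HashMap;
import java.util.Map;

public class ToolEnvSketch {
    // Stand-in for the node's configured tool installations; only the
    // Gradle 3.5 home exists, mirroring "Setting GRADLE_3_5_HOME=..." above.
    static final Map<String, String> INSTALLED = new HashMap<>();
    static {
        INSTALLED.put("GRADLE_3_5_HOME", "/home/jenkins/tools/gradle/3.5");
    }

    // Returns the tool home, or null when no installation matches the name.
    static String resolveHome(String var) {
        return INSTALLED.get(var);
    }

    public static void main(String[] args) {
        String home = resolveHome("GRADLE_3_4_RC_2_HOME");
        // An unguarded dereference of the null lookup result reproduces the
        // NullPointerException pattern seen in buildEnvVars above.
        try {
            System.out.println(home.length());
        } catch (NullPointerException e) {
            System.out.println("NullPointerException on missing tool home");
        }
    }
}
```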
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
ERROR: Could not install GRADLE_3_5_HOME
java.lang.NullPointerException
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
ERROR: Could not install GRADLE_3_5_HOME
java.lang.NullPointerException
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
ERROR: Could not install GRADLE_3_5_HOME
java.lang.NullPointerException
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
ERROR: Could not install GRADLE_3_5_HOME
java.lang.NullPointerException
Build timed out (after 360 minutes). Marking the build as failed.
Build was aborted
[FINDBUGS] Collecting findbugs analysis files...
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
[FINDBUGS] Searching for all files in <https://builds.apache.org/job/kafka-trunk-jdk8/ws/> that match the pattern **/build/reports/findbugs/*.xml
[FINDBUGS] Parsing 1 file in <https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/reports/findbugs/main.xml> with 0 unique warnings and 0 duplicates.
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
<Git Blamer> Using GitBlamer to create author and commit information for all warnings.
<Git Blamer> GIT_COMMIT=b814a16b968d144802d08523b5c359d6706f5632, workspace=<https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Computing warning deltas based on reference build #2342
Recording test results
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error?
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
Not sending mail to unregistered user wangguoz@gmail.com
Build failed in Jenkins: kafka-trunk-jdk8 #2344
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/kafka-trunk-jdk8/2344/display/redirect?page=changes>
Changes:
[wangguoz] MINOR: update docs with regard to improved resilience of Kafka Streams
[wangguoz] MINOR: increase timeout for unstable
[me] MINOR: Add async and different sync startup modes in connect service
------------------------------------------
[...truncated 135.76 KB...]
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/other/kafka/TestPurgatoryPerformance.scala>:277: error writing class TestPurgatoryPerformance$CompletionQueue$Scheduled: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/TestPurgatoryPerformance$CompletionQueue$Scheduled.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
private class Scheduled(val operation: FakeOperation) extends Delayed {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/other/kafka/TestPurgatoryPerformance.scala>:185: error writing class TestPurgatoryPerformance$ExponentialDistribution: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/TestPurgatoryPerformance$ExponentialDistribution.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
private class ExponentialDistribution(lambda: Double) {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/other/kafka/TestPurgatoryPerformance.scala>:240: error writing class TestPurgatoryPerformance$FakeOperation: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/TestPurgatoryPerformance$FakeOperation.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
private class FakeOperation(delayMs: Long, size: Int, val latencyMs: Long, latch: CountDownLatch) extends DelayedOperation(delayMs) {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/other/kafka/TestPurgatoryPerformance.scala>:217: error writing class TestPurgatoryPerformance$IntervalSamples: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/TestPurgatoryPerformance$IntervalSamples.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
private class IntervalSamples(sampleSize: Int, requestPerSecond: Double) {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/other/kafka/TestPurgatoryPerformance.scala>:222: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/TestPurgatoryPerformance$IntervalSamples$$anonfun$3.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
(0 until sampleSize).map { _ =>
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/other/kafka/TestPurgatoryPerformance.scala>:235: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/TestPurgatoryPerformance$IntervalSamples$$anonfun$printStats$1.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
.format(1000d / (samples.map(_.toDouble).sum / sampleSize.toDouble), samples.min, samples.max)
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/other/kafka/TestPurgatoryPerformance.scala>:196: error writing class TestPurgatoryPerformance$LatencySamples: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/TestPurgatoryPerformance$LatencySamples.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
private class LatencySamples(sampleSize: Int, pct75: Double, pct50: Double) {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/other/kafka/TestPurgatoryPerformance.scala>:202: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/TestPurgatoryPerformance$LatencySamples$$anonfun$2.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
(0 until sampleSize).map { _ => dist.next().toLong }.toArray
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/other/kafka/TestPurgatoryPerformance.scala>:175: error writing class TestPurgatoryPerformance$LogNormalDistribution: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/TestPurgatoryPerformance$LogNormalDistribution.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
private class LogNormalDistribution(mu: Double, sigma: Double) {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/other/kafka/TestTruncate.scala>:24: error writing object TestTruncate: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/TestTruncate.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
object TestTruncate {
^
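Every compiler message in this run is the same underlying failure: a path under core/build/classes/scala/test exists but "is not a directory", so no class file can be written beneath it. A self-contained sketch of that condition (the temp-dir paths are illustrative stand-ins, not the Jenkins workspace):

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class OutputDirSketch {
    // True iff "parent" is usable as a class-file output directory:
    // either it already is a directory, or it can be created as one.
    static boolean usableOutputDir(Path parent) {
        File dir = parent.toFile();
        return dir.isDirectory() || dir.mkdirs();
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempDirectory("outdir-sketch");
        Path blocker = tmp.resolve("kafka");
        Files.createFile(blocker); // a plain FILE named "kafka" blocks the path
        // mkdirs() fails because the path already exists as a file, which is
        // the "is not a directory" condition the compiler reports above.
        System.out.println("usable output dir: " + usableOutputDir(blocker));
    }
}
```

In the build above this typically points at a stale or corrupted workspace rather than a source problem, which is consistent with the build-timeout and aborted steps later in the same log.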
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:29: error writing class AclCommandTest: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
class AclCommandTest extends ZooKeeperTestHarness with Logging {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:79: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$1.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
Array[String]("--producer", "--consumer") -> ConsumerResourceToAcls.map { case (k, v) => k -> (v ++
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:80: error writing <$anon: Function0>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$1$$anonfun$apply$1.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
ProducerResourceToAcls().getOrElse(k, Set.empty[Acl])) },
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:81: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$2.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
Array[String]("--producer", "--idempotent", "--consumer") -> ConsumerResourceToAcls.map { case (k, v) => k -> (v ++
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:82: error writing <$anon: Function0>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$2$$anonfun$apply$2.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
ProducerResourceToAcls(enableIdempotence = true).getOrElse(k, Set.empty[Acl])) }
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:67: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$kafka$admin$AclCommandTest$$ProducerResourceToAcls$1.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
if (enableIdempotence) Some(IdempotentWrite) else None).flatten, Hosts)
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:150: error writing <$anon: Function2>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$kafka$admin$AclCommandTest$$getCmd$1.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
Users.foldLeft(cmd) ((cmd, user) => cmd ++ Array(principalCmd, user.toString))
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:134: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$kafka$admin$AclCommandTest$$testRemove$1.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
for (resource <- resources) {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:136: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$kafka$admin$AclCommandTest$$testRemove$1$$anonfun$apply$10.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
withAuthorizer(brokerProps) { authorizer =>
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:91: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$testAclCli$1.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
for ((resources, resourceCmd) <- ResourceToCommand) {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:91: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$testAclCli$2.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
for ((resources, resourceCmd) <- ResourceToCommand) {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:92: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$testAclCli$2$$anonfun$apply$3.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
for (permissionType <- PermissionType.values) {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:96: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$testAclCli$2$$anonfun$apply$3$$anonfun$apply$4.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
for (resource <- resources) {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:97: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$testAclCli$2$$anonfun$apply$3$$anonfun$apply$4$$anonfun$apply$5.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
withAuthorizer(brokerProps) { authorizer =>
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:113: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$testProducerConsumerCli$1.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
for ((cmd, resourcesToAcls) <- CmdToResourcesToAcl) {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:113: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$testProducerConsumerCli$2.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
for ((cmd, resourcesToAcls) <- CmdToResourcesToAcl) {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:114: error writing <$anon: Function2>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$testProducerConsumerCli$2$$anonfun$3.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
val resourceCommand: Array[String] = resourcesToAcls.keys.map(ResourceToCommand).foldLeft(Array[String]())(_ ++ _)
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:116: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$testProducerConsumerCli$2$$anonfun$apply$6.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
for ((resources, acls) <- resourcesToAcls) {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:116: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$testProducerConsumerCli$2$$anonfun$apply$7.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
for ((resources, acls) <- resourcesToAcls) {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:117: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$testProducerConsumerCli$2$$anonfun$apply$7$$anonfun$apply$8.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
for (resource <- resources) {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AclCommandTest.scala>:118: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AclCommandTest$$anonfun$testProducerConsumerCli$2$$anonfun$apply$7$$anonfun$apply$8$$anonfun$apply$9.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
withAuthorizer(brokerProps) { authorizer =>
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AddPartitionsTest.scala>:34: error writing class AddPartitionsTest: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AddPartitionsTest.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
class AddPartitionsTest extends ZooKeeperTestHarness {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AddPartitionsTest.scala>:112: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AddPartitionsTest$$anonfun$1.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
val metadata = ClientUtils.fetchTopicMetadata(Set(topic1), brokers.map(_.brokerEndPoint(listenerName)),
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AddPartitionsTest.scala>:193: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AddPartitionsTest$$anonfun$10.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
val metaDataForTopic2 = metadata.find(p => p.topic == topic2).get
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AddPartitionsTest.scala>:201: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AddPartitionsTest$$anonfun$11.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
val partitionOpt = metadata.partitionsMetadata.find(_.partitionId == partitionId)
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AddPartitionsTest.scala>:114: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AddPartitionsTest$$anonfun$2.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
val metaDataForTopic1 = metadata.filter(p => p.topic.equals(topic1))
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AddPartitionsTest.scala>:115: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AddPartitionsTest$$anonfun$3.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
val partitionDataForTopic1 = metaDataForTopic1.head.partitionsMetadata.sortBy(_.partitionId)
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AddPartitionsTest.scala>:141: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AddPartitionsTest$$anonfun$4.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
brokers.map(_.brokerEndPoint(ListenerName.forSecurityProtocol(SecurityProtocol.PLAINTEXT))),
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AddPartitionsTest.scala>:143: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AddPartitionsTest$$anonfun$5.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
val metaDataForTopic2 = metadata.filter(_.topic == topic2)
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AddPartitionsTest.scala>:144: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AddPartitionsTest$$anonfun$6.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
val partitionDataForTopic2 = metaDataForTopic2.head.partitionsMetadata.sortBy(_.partitionId)
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AddPartitionsTest.scala>:167: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AddPartitionsTest$$anonfun$7.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
brokers.map(_.brokerEndPoint(ListenerName.forSecurityProtocol(SecurityProtocol.PLAINTEXT))),
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AddPartitionsTest.scala>:170: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AddPartitionsTest$$anonfun$8.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
val metaDataForTopic3 = metadata.find(p => p.topic == topic3).get
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AddPartitionsTest.scala>:190: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AddPartitionsTest$$anonfun$9.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
brokers.map(_.brokerEndPoint(ListenerName.forSecurityProtocol(SecurityProtocol.PLAINTEXT))),
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AddPartitionsTest.scala>:56: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AddPartitionsTest$$anonfun$setUp$1.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
configs = (0 until 4).map(i => KafkaConfig.fromProps(TestUtils.createBrokerConfig(i, zkConnect, enableControlledShutdown = false)))
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AddPartitionsTest.scala>:58: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AddPartitionsTest$$anonfun$setUp$2.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
servers = configs.map(c => TestUtils.createServer(c))
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AddPartitionsTest.scala>:59: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AddPartitionsTest$$anonfun$setUp$3.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
brokers = servers.map(s => TestUtils.createBroker(s.config.brokerId, s.config.hostName, TestUtils.boundPort(s)))
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AddPartitionsTest.scala>:208: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AddPartitionsTest$$anonfun$validateLeaderAndReplicas$1.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
assertEquals("Replica set should match", expectedReplicas, partition.replicas.map(_.id).toSet)
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminRackAwareTest.scala>:25: error writing class AdminRackAwareTest: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminRackAwareTest.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
class AdminRackAwareTest extends RackAwareTest with Logging {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminRackAwareTest.scala>:162: error writing <$anon: Function0>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminRackAwareTest$$anonfun$testLessReplicasThanRacks$1.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
assertEquals(List.fill(assignment.size)(replicationFactor), assignment.values.map(_.size))
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminRackAwareTest.scala>:164: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminRackAwareTest$$anonfun$testLessReplicasThanRacks$2.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
for (partition <- 0 to 5)
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminRackAwareTest.scala>:162: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminRackAwareTest$$anonfun$testLessReplicasThanRacks$3.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
assertEquals(List.fill(assignment.size)(replicationFactor), assignment.values.map(_.size))
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminRackAwareTest.scala>:149: error writing <$anon: Function0>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminRackAwareTest$$anonfun$testMoreReplicasThanRacks$1.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
assertEquals(List.fill(assignment.size)(replicationFactor), assignment.values.map(_.size))
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminRackAwareTest.scala>:151: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminRackAwareTest$$anonfun$testMoreReplicasThanRacks$2.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
for (partition <- 0 until numPartitions)
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminRackAwareTest.scala>:149: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminRackAwareTest$$anonfun$testMoreReplicasThanRacks$3.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
assertEquals(List.fill(assignment.size)(replicationFactor), assignment.values.map(_.size))
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminRackAwareTest.scala>:174: error writing <$anon: Function0>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminRackAwareTest$$anonfun$testSingleRack$1.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
assertEquals(List.fill(assignment.size)(replicationFactor), assignment.values.map(_.size))
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminRackAwareTest.scala>:176: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminRackAwareTest$$anonfun$testSingleRack$2.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
for (partition <- 0 until numPartitions)
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminRackAwareTest.scala>:178: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminRackAwareTest$$anonfun$testSingleRack$3.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
for (broker <- brokerRackMapping.keys)
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminRackAwareTest.scala>:174: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminRackAwareTest$$anonfun$testSingleRack$4.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
assertEquals(List.fill(assignment.size)(replicationFactor), assignment.values.map(_.size))
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminRackAwareTest.scala>:189: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminRackAwareTest$$anonfun$testSkipBrokerWithReplicaAlreadyAssigned$1.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
assertEquals(brokerList, brokerMetadatas.map(_.id))
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminTest.scala>:46: error writing class AdminTest: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminTest.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
class AdminTest extends ZooKeeperTestHarness with Logging with RackAwareTest {
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminTest.scala>:58: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminTest$$anonfun$1.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
val brokerMetadatas = (0 to 4).map(new BrokerMetadata(_, None))
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminTest.scala>:146: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminTest$$anonfun$2.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
val actualReplicaList = leaderForPartitionMap.keys.toArray.map(p => p -> zkUtils.getReplicasForPartition(topic, p)).toMap
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminTest.scala>:344: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminTest$$anonfun$3.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
val serverConfigs = TestUtils.createBrokerConfigs(3, zkConnect, false, rackInfo = brokerRack).map(KafkaConfig.fromProps)
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminTest.scala>:363: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminTest$$anonfun$4.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
val serverConfigs = TestUtils.createBrokerConfigs(3, zkConnect, false).map(KafkaConfig.fromProps)
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminTest.scala>:369: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminTest$$anonfun$5.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
val controller = servers.find(p => p.config.brokerId == controllerId).get.kafkaController
^
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/src/test/scala/unit/kafka/admin/AdminTest.scala>:371: error writing <$anon: Function1>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka/admin/AdminTest$$anonfun$6.class>: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/classes/scala/test/kafka> is not a directory
val controlledShutdownCallback = (controlledShutdownResult: Try[Set[TopicPartition]]) => resultQueue.put(controlledShutdownResult)
^
202 warnings found
3827 errors found
:kafka-trunk-jdk8:core:compileTestScala FAILED
:test_core_2_11 FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':kafka-trunk-jdk8:core:compileTestScala'.
> Compilation failed
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 2m 24s
16 actionable tasks: 3 executed, 13 up-to-date
FAILURE: Build failed with an exception.
* What went wrong:
Could not add entry ':core:compileTestScala' to cache taskHistory.bin (<https://builds.apache.org/job/kafka-trunk-jdk8/ws/.gradle/4.4.1/taskHistory/taskHistory.bin>).
> java.io.IOException: No space left on device
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 2m 25s
Build step 'Execute shell' marked build as failure
[FINDBUGS] Collecting findbugs analysis files...
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
[FINDBUGS] Searching for all files in <https://builds.apache.org/job/kafka-trunk-jdk8/ws/> that match the pattern **/build/reports/findbugs/*.xml
[FINDBUGS] No files found. Configuration error?
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
<Git Blamer> Using GitBlamer to create author and commit information for all warnings.
<Git Blamer> GIT_COMMIT=83cc138e0c04a2f30f4536c27314890d06818190, workspace=<https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Computing warning deltas based on reference build #2342
Recording test results
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error?
ERROR: No tool found matching GRADLE_3_4_RC_2_HOME
Setting GRADLE_3_5_HOME=/home/jenkins/tools/gradle/3.5
Not sending mail to unregistered user wangguoz@gmail.com