Posted to dev@kafka.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/03/08 03:11:00 UTC

Build failed in Jenkins: kafka-trunk-jdk8 #3445

See <https://builds.apache.org/job/kafka-trunk-jdk8/3445/display/redirect?page=changes>

Changes:

[jason] KAFKA-8061; Handle concurrent ProducerId reset and call to Sender thread

[bbejeck] Improve API docs of (flatT|t)ransform (#6365)

[wangguoz] Minor resolve streams scala warnings (#6369)

------------------------------------------
[...truncated 2.34 MB...]

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testTopicsRegex STARTED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testTopicsRegex PASSED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testPollRedelivery STARTED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testPollRedelivery PASSED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testStartPaused STARTED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testStartPaused PASSED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testPause STARTED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testPause PASSED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testLongRunningCommitWithoutTimeout STARTED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testLongRunningCommitWithoutTimeout PASSED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testCommitWithOutOfOrderCallback STARTED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testCommitWithOutOfOrderCallback PASSED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testMissingTimestampPropagation STARTED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testMissingTimestampPropagation PASSED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testPreCommit STARTED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testPreCommit PASSED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testErrorInRebalancePartitionRevocation STARTED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testErrorInRebalancePartitionRevocation PASSED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testMetricsGroup STARTED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testMetricsGroup PASSED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testIgnoredCommit STARTED

org.apache.kafka.connect.runtime.WorkerSinkTaskTest > testIgnoredCommit PASSED

> Task :streams:streams-scala:test

org.apache.kafka.streams.scala.kstream.ProducedTest > Create a Produced should create a Produced with Serdes STARTED

org.apache.kafka.streams.scala.kstream.ProducedTest > Create a Produced should create a Produced with Serdes PASSED

org.apache.kafka.streams.scala.kstream.ProducedTest > Create a Produced with timestampExtractor and resetPolicy should create a Consumed with Serdes, timestampExtractor and resetPolicy STARTED

org.apache.kafka.streams.scala.kstream.ProducedTest > Create a Produced with timestampExtractor and resetPolicy should create a Consumed with Serdes, timestampExtractor and resetPolicy PASSED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialized should create a Materialized with Serdes STARTED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialized should create a Materialized with Serdes PASSED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a store name should create a Materialized with Serdes and a store name STARTED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a store name should create a Materialized with Serdes and a store name PASSED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a window store supplier should create a Materialized with Serdes and a store supplier STARTED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a window store supplier should create a Materialized with Serdes and a store supplier PASSED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a key value store supplier should create a Materialized with Serdes and a store supplier STARTED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a key value store supplier should create a Materialized with Serdes and a store supplier PASSED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a session store supplier should create a Materialized with Serdes and a store supplier STARTED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a session store supplier should create a Materialized with Serdes and a store supplier PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > filter a KStream should filter records satisfying the predicate STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > filter a KStream should filter records satisfying the predicate PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > filterNot a KStream should filter records not satisfying the predicate STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > filterNot a KStream should filter records not satisfying the predicate PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > foreach a KStream should run foreach actions on records STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > foreach a KStream should run foreach actions on records PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > peek a KStream should run peek actions on records STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > peek a KStream should run peek actions on records PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > selectKey a KStream should select a new key STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > selectKey a KStream should select a new key PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > join 2 KStreams should join correctly records STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > join 2 KStreams should join correctly records PASSED

org.apache.kafka.streams.scala.kstream.KTableTest > filter a KTable should filter records satisfying the predicate STARTED

org.apache.kafka.streams.scala.kstream.KTableTest > filter a KTable should filter records satisfying the predicate PASSED

org.apache.kafka.streams.scala.kstream.KTableTest > filterNot a KTable should filter records not satisfying the predicate STARTED

org.apache.kafka.streams.scala.kstream.KTableTest > filterNot a KTable should filter records not satisfying the predicate PASSED

org.apache.kafka.streams.scala.kstream.KTableTest > join 2 KTables should join correctly records STARTED

org.apache.kafka.streams.scala.kstream.KTableTest > join 2 KTables should join correctly records PASSED

org.apache.kafka.streams.scala.kstream.KTableTest > join 2 KTables with a Materialized should join correctly records and state store STARTED

org.apache.kafka.streams.scala.kstream.KTableTest > join 2 KTables with a Materialized should join correctly records and state store PASSED

org.apache.kafka.streams.scala.kstream.GroupedTest > Create a Grouped should create a Grouped with Serdes STARTED

org.apache.kafka.streams.scala.kstream.GroupedTest > Create a Grouped should create a Grouped with Serdes PASSED

org.apache.kafka.streams.scala.kstream.GroupedTest > Create a Grouped with repartition topic name should create a Grouped with Serdes, and repartition topic name STARTED

org.apache.kafka.streams.scala.kstream.GroupedTest > Create a Grouped with repartition topic name should create a Grouped with Serdes, and repartition topic name PASSED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed should create a Consumed with Serdes STARTED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed should create a Consumed with Serdes PASSED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with timestampExtractor and resetPolicy should create a Consumed with Serdes, timestampExtractor and resetPolicy STARTED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with timestampExtractor and resetPolicy should create a Consumed with Serdes, timestampExtractor and resetPolicy PASSED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with timestampExtractor should create a Consumed with Serdes and timestampExtractor STARTED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with timestampExtractor should create a Consumed with Serdes and timestampExtractor PASSED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with resetPolicy should create a Consumed with Serdes and resetPolicy STARTED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with resetPolicy should create a Consumed with Serdes and resetPolicy PASSED

org.apache.kafka.streams.scala.kstream.JoinedTest > Create a Joined should create a Joined with Serdes STARTED

org.apache.kafka.streams.scala.kstream.JoinedTest > Create a Joined should create a Joined with Serdes PASSED

org.apache.kafka.streams.scala.kstream.JoinedTest > Create a Joined should create a Joined with Serdes and repartition topic name STARTED

org.apache.kafka.streams.scala.kstream.JoinedTest > Create a Joined should create a Joined with Serdes and repartition topic name PASSED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWordsMaterialized STARTED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWordsMaterialized PASSED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWordsJava STARTED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWordsJava PASSED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWords STARTED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWords PASSED

org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaJoin STARTED

org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaJoin PASSED

org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaSimple STARTED

org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaSimple PASSED

org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaAggregate STARTED

org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaAggregate PASSED

org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaTransform STARTED

org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaTransform PASSED

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes > testShouldCountClicksPerRegionWithNamedRepartitionTopic STARTED

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes > testShouldCountClicksPerRegionWithNamedRepartitionTopic PASSED

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes > testShouldCountClicksPerRegionJava STARTED

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes > testShouldCountClicksPerRegionJava PASSED

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes > testShouldCountClicksPerRegion STARTED

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes > testShouldCountClicksPerRegion PASSED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':streams:test'.
> There were failing tests. See the report at: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/build/reports/tests/test/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':core:test'.
> There were failing tests. See the report at: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/reports/tests/test/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org
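
The failing tasks above are ':streams:test' and ':core:test'. A minimal local reproduction, assuming the Gradle wrapper bootstrap described in the Kafka README, would look something like:

    gradle                                              # one-time wrapper bootstrap in the source checkout
    ./gradlew :streams:test :core:test --stacktrace --info

This mirrors the 'Try:' hints Gradle prints above; adding --scan would additionally publish a build scan with full insights.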

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.1.1/userguide/command_line_interface.html#sec:command_line_warnings
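
The deprecation notice is informational and separate from the test failures; to surface the individual warnings, the same build can be re-run with Gradle's warning mode enabled, for example:

    ./gradlew test --warning-mode all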

BUILD FAILED in 2h 43m 24s
173 actionable tasks: 160 executed, 13 up-to-date
Build step 'Execute shell' marked build as failure
[FINDBUGS] Collecting findbugs analysis files...
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
[FINDBUGS] Searching for all files in <https://builds.apache.org/job/kafka-trunk-jdk8/ws/> that match the pattern **/build/reports/*bugs/*.xml
[FINDBUGS] Parsing 17 files in <https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/clients/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/api/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/basic-auth-extension/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/file/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/json/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/runtime/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/transforms/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/examples/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/generator/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/jmh-benchmarks/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/log4j-appender/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/examples/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/streams-scala/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/test-utils/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/tools/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
No credentials specified
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
<Git Blamer> Using GitBlamer to create author and commit information for all warnings.
<Git Blamer> GIT_COMMIT=ccd3af15669d38bc8dba3a376a23e9615faf98a2, workspace=<https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Computing warning deltas based on reference build #3444
Recording test results
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
Not sending mail to unregistered user wangguoz@gmail.com

Jenkins build is back to normal : kafka-trunk-jdk8 #3448

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/kafka-trunk-jdk8/3448/display/redirect?page=changes>


Build failed in Jenkins: kafka-trunk-jdk8 #3447

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/kafka-trunk-jdk8/3447/display/redirect?page=changes>

Changes:

[manikumar] KAFKA-8060: The Kafka protocol generator should allow null defaults

------------------------------------------
[...truncated 4.67 MB...]

org.apache.kafka.streams.internals.KeyValueStoreFacadeTest > shouldPutAllWithUnknownTimestamp STARTED

org.apache.kafka.streams.internals.KeyValueStoreFacadeTest > shouldPutAllWithUnknownTimestamp PASSED

org.apache.kafka.streams.internals.KeyValueStoreFacadeTest > shouldReturnIsPersistent STARTED

org.apache.kafka.streams.internals.KeyValueStoreFacadeTest > shouldReturnIsPersistent PASSED

org.apache.kafka.streams.internals.KeyValueStoreFacadeTest > shouldPutIfAbsentWithUnknownTimestamp STARTED

org.apache.kafka.streams.internals.KeyValueStoreFacadeTest > shouldPutIfAbsentWithUnknownTimestamp PASSED

org.apache.kafka.streams.internals.KeyValueStoreFacadeTest > shouldForwardClose STARTED

org.apache.kafka.streams.internals.KeyValueStoreFacadeTest > shouldForwardClose PASSED

org.apache.kafka.streams.internals.KeyValueStoreFacadeTest > shouldForwardFlush STARTED

org.apache.kafka.streams.internals.KeyValueStoreFacadeTest > shouldForwardFlush PASSED

org.apache.kafka.streams.internals.KeyValueStoreFacadeTest > shouldForwardInit STARTED

org.apache.kafka.streams.internals.KeyValueStoreFacadeTest > shouldForwardInit PASSED

org.apache.kafka.streams.internals.WindowStoreFacadeTest > shouldReturnIsOpen STARTED

org.apache.kafka.streams.internals.WindowStoreFacadeTest > shouldReturnIsOpen PASSED

org.apache.kafka.streams.internals.WindowStoreFacadeTest > shouldReturnName STARTED

org.apache.kafka.streams.internals.WindowStoreFacadeTest > shouldReturnName PASSED

org.apache.kafka.streams.internals.WindowStoreFacadeTest > shouldPutWithUnknownTimestamp STARTED

org.apache.kafka.streams.internals.WindowStoreFacadeTest > shouldPutWithUnknownTimestamp PASSED

org.apache.kafka.streams.internals.WindowStoreFacadeTest > shouldPutWindowStartTimestampWithUnknownTimestamp STARTED

org.apache.kafka.streams.internals.WindowStoreFacadeTest > shouldPutWindowStartTimestampWithUnknownTimestamp PASSED

org.apache.kafka.streams.internals.WindowStoreFacadeTest > shouldReturnIsPersistent STARTED

org.apache.kafka.streams.internals.WindowStoreFacadeTest > shouldReturnIsPersistent PASSED

org.apache.kafka.streams.internals.WindowStoreFacadeTest > shouldForwardClose STARTED

org.apache.kafka.streams.internals.WindowStoreFacadeTest > shouldForwardClose PASSED

org.apache.kafka.streams.internals.WindowStoreFacadeTest > shouldForwardFlush STARTED

org.apache.kafka.streams.internals.WindowStoreFacadeTest > shouldForwardFlush PASSED

org.apache.kafka.streams.internals.WindowStoreFacadeTest > shouldForwardInit STARTED

org.apache.kafka.streams.internals.WindowStoreFacadeTest > shouldForwardInit PASSED

> Task :streams:streams-scala:test

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes > testShouldCountClicksPerRegionWithNamedRepartitionTopic STARTED

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes > testShouldCountClicksPerRegionWithNamedRepartitionTopic PASSED

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes > testShouldCountClicksPerRegionJava STARTED

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes > testShouldCountClicksPerRegionJava PASSED

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes > testShouldCountClicksPerRegion STARTED

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes > testShouldCountClicksPerRegion PASSED

org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaJoin STARTED

org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaJoin PASSED

org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaSimple STARTED

org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaSimple PASSED

org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaAggregate STARTED

org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaAggregate PASSED

org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaTransform STARTED

org.apache.kafka.streams.scala.TopologyTest > shouldBuildIdenticalTopologyInJavaNScalaTransform PASSED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWordsMaterialized STARTED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWordsMaterialized PASSED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWordsJava STARTED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWordsJava PASSED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWords STARTED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWords PASSED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialized should create a Materialized with Serdes STARTED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialized should create a Materialized with Serdes PASSED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a store name should create a Materialized with Serdes and a store name STARTED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a store name should create a Materialized with Serdes and a store name PASSED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a window store supplier should create a Materialized with Serdes and a store supplier STARTED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a window store supplier should create a Materialized with Serdes and a store supplier PASSED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a key value store supplier should create a Materialized with Serdes and a store supplier STARTED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a key value store supplier should create a Materialized with Serdes and a store supplier PASSED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a session store supplier should create a Materialized with Serdes and a store supplier STARTED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize with a session store supplier should create a Materialized with Serdes and a store supplier PASSED

org.apache.kafka.streams.scala.kstream.KTableTest > filter a KTable should filter records satisfying the predicate STARTED

org.apache.kafka.streams.scala.kstream.KTableTest > filter a KTable should filter records satisfying the predicate PASSED

org.apache.kafka.streams.scala.kstream.KTableTest > filterNot a KTable should filter records not satisfying the predicate STARTED

org.apache.kafka.streams.scala.kstream.KTableTest > filterNot a KTable should filter records not satisfying the predicate PASSED

org.apache.kafka.streams.scala.kstream.KTableTest > join 2 KTables should join correctly records STARTED

org.apache.kafka.streams.scala.kstream.KTableTest > join 2 KTables should join correctly records PASSED

org.apache.kafka.streams.scala.kstream.KTableTest > join 2 KTables with a Materialized should join correctly records and state store STARTED

org.apache.kafka.streams.scala.kstream.KTableTest > join 2 KTables with a Materialized should join correctly records and state store PASSED

org.apache.kafka.streams.scala.kstream.GroupedTest > Create a Grouped should create a Grouped with Serdes STARTED

org.apache.kafka.streams.scala.kstream.GroupedTest > Create a Grouped should create a Grouped with Serdes PASSED

org.apache.kafka.streams.scala.kstream.GroupedTest > Create a Grouped with repartition topic name should create a Grouped with Serdes, and repartition topic name STARTED

org.apache.kafka.streams.scala.kstream.GroupedTest > Create a Grouped with repartition topic name should create a Grouped with Serdes, and repartition topic name PASSED

org.apache.kafka.streams.scala.kstream.ProducedTest > Create a Produced should create a Produced with Serdes STARTED

org.apache.kafka.streams.scala.kstream.ProducedTest > Create a Produced should create a Produced with Serdes PASSED

org.apache.kafka.streams.scala.kstream.ProducedTest > Create a Produced with timestampExtractor and resetPolicy should create a Consumed with Serdes, timestampExtractor and resetPolicy STARTED

org.apache.kafka.streams.scala.kstream.ProducedTest > Create a Produced with timestampExtractor and resetPolicy should create a Consumed with Serdes, timestampExtractor and resetPolicy PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > filter a KStream should filter records satisfying the predicate STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > filter a KStream should filter records satisfying the predicate PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > filterNot a KStream should filter records not satisfying the predicate STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > filterNot a KStream should filter records not satisfying the predicate PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > foreach a KStream should run foreach actions on records STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > foreach a KStream should run foreach actions on records PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > peek a KStream should run peek actions on records STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > peek a KStream should run peek actions on records PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > selectKey a KStream should select a new key STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > selectKey a KStream should select a new key PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > join 2 KStreams should join correctly records STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > join 2 KStreams should join correctly records PASSED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed should create a Consumed with Serdes STARTED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed should create a Consumed with Serdes PASSED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with timestampExtractor and resetPolicy should create a Consumed with Serdes, timestampExtractor and resetPolicy STARTED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with timestampExtractor and resetPolicy should create a Consumed with Serdes, timestampExtractor and resetPolicy PASSED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with timestampExtractor should create a Consumed with Serdes and timestampExtractor STARTED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with timestampExtractor should create a Consumed with Serdes and timestampExtractor PASSED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with resetPolicy should create a Consumed with Serdes and resetPolicy STARTED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with resetPolicy should create a Consumed with Serdes and resetPolicy PASSED

org.apache.kafka.streams.scala.kstream.JoinedTest > Create a Joined should create a Joined with Serdes STARTED

org.apache.kafka.streams.scala.kstream.JoinedTest > Create a Joined should create a Joined with Serdes PASSED

org.apache.kafka.streams.scala.kstream.JoinedTest > Create a Joined should create a Joined with Serdes and repartition topic name STARTED

org.apache.kafka.streams.scala.kstream.JoinedTest > Create a Joined should create a Joined with Serdes and repartition topic name PASSED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':clients:test'.
> There were failing tests. See the report at: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/clients/build/reports/tests/test/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.1.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 54m 3s
173 actionable tasks: 170 executed, 3 up-to-date
Build step 'Execute shell' marked build as failure
[FINDBUGS] Collecting findbugs analysis files...
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
[FINDBUGS] Searching for all files in <https://builds.apache.org/job/kafka-trunk-jdk8/ws/> that match the pattern **/build/reports/*bugs/*.xml
[FINDBUGS] Parsing 17 files in <https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/clients/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/api/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/basic-auth-extension/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/file/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/json/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/runtime/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/transforms/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/examples/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/generator/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/jmh-benchmarks/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/log4j-appender/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/examples/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/streams-scala/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/test-utils/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/tools/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
No credentials specified
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
<Git Blamer> Using GitBlamer to create author and commit information for all warnings.
<Git Blamer> GIT_COMMIT=027cbbaec521542f53274183daccc2073e91cfe9, workspace=<https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Computing warning deltas based on reference build #3444
Recording test results
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
Not sending mail to unregistered user wangguoz@gmail.com
Not sending mail to unregistered user noreply@github.com

Build failed in Jenkins: kafka-trunk-jdk8 #3446

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/kafka-trunk-jdk8/3446/display/redirect?page=changes>

Changes:

[github] KAFKA-7831; Do not modify subscription state from background thread

------------------------------------------
[...truncated 4.68 MB...]
org.apache.kafka.streams.TopologyTestDriverTest > shouldUseSourceSpecificDeserializers[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldReturnAllStores[Eos enabled = true] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldReturnAllStores[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldSendRecordViaCorrectSourceTopic[Eos enabled = true] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldSendRecordViaCorrectSourceTopic[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldNotCreateStateDirectoryForStatelessTopology[Eos enabled = true] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldNotCreateStateDirectoryForStatelessTopology[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldReturnAllStoresNames[Eos enabled = true] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldReturnAllStoresNames[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldProcessConsumerRecordList[Eos enabled = true] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldProcessConsumerRecordList[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldUseSinkSpecificSerializers[Eos enabled = true] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldUseSinkSpecificSerializers[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldFlushStoreForFirstInput[Eos enabled = true] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldFlushStoreForFirstInput[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldProcessFromSourceThatMatchPattern[Eos enabled = true] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldProcessFromSourceThatMatchPattern[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldUpdateStoreForNewKey[Eos enabled = true] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldUpdateStoreForNewKey[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldPunctuateOnWallClockTime[Eos enabled = true] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldPunctuateOnWallClockTime[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldSetRecordMetadata[Eos enabled = true] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldSetRecordMetadata[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldNotUpdateStoreForLargerValue[Eos enabled = true] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldNotUpdateStoreForLargerValue[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldProcessRecordForTopic[Eos enabled = true] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldProcessRecordForTopic[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldForwardRecordsFromSubtopologyToSubtopology[Eos enabled = true] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldForwardRecordsFromSubtopologyToSubtopology[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldNotUpdateStoreForSmallerValue[Eos enabled = true] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldNotUpdateStoreForSmallerValue[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldCreateStateDirectoryForStatefulTopology[Eos enabled = true] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldCreateStateDirectoryForStatefulTopology[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldPunctuateIfWallClockTimeAdvances[Eos enabled = true] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldPunctuateIfWallClockTimeAdvances[Eos enabled = true] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldCloseProcessor[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldCloseProcessor[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldFeedStoreFromGlobalKTable[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldFeedStoreFromGlobalKTable[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldCleanUpPersistentStateStoresOnClose[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldCleanUpPersistentStateStoresOnClose[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldThrowPatternNotValidForTopicNameException[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldThrowPatternNotValidForTopicNameException[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldPunctuateIfEvenTimeAdvances[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldPunctuateIfEvenTimeAdvances[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldInitProcessor[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldInitProcessor[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldThrowForUnknownTopic[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldThrowForUnknownTopic[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldPunctuateOnStreamsTime[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldPunctuateOnStreamsTime[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldProcessFromSourcesThatMatchMultiplePattern[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldProcessFromSourcesThatMatchMultiplePattern[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldPopulateGlobalStore[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldPopulateGlobalStore[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldAllowPrePopulatingStatesStoresWithCachingEnabled[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldAllowPrePopulatingStatesStoresWithCachingEnabled[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldUseSourceSpecificDeserializers[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldUseSourceSpecificDeserializers[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldReturnAllStores[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldReturnAllStores[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldSendRecordViaCorrectSourceTopic[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldSendRecordViaCorrectSourceTopic[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldNotCreateStateDirectoryForStatelessTopology[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldNotCreateStateDirectoryForStatelessTopology[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldReturnAllStoresNames[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldReturnAllStoresNames[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldProcessConsumerRecordList[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldProcessConsumerRecordList[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldUseSinkSpecificSerializers[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldUseSinkSpecificSerializers[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldFlushStoreForFirstInput[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldFlushStoreForFirstInput[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldProcessFromSourceThatMatchPattern[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldProcessFromSourceThatMatchPattern[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldUpdateStoreForNewKey[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldUpdateStoreForNewKey[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldPunctuateOnWallClockTime[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldPunctuateOnWallClockTime[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldSetRecordMetadata[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldSetRecordMetadata[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldNotUpdateStoreForLargerValue[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldNotUpdateStoreForLargerValue[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldProcessRecordForTopic[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldProcessRecordForTopic[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldForwardRecordsFromSubtopologyToSubtopology[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldForwardRecordsFromSubtopologyToSubtopology[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldNotUpdateStoreForSmallerValue[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldNotUpdateStoreForSmallerValue[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldCreateStateDirectoryForStatefulTopology[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldCreateStateDirectoryForStatefulTopology[Eos enabled = false] PASSED

org.apache.kafka.streams.TopologyTestDriverTest > shouldPunctuateIfWallClockTimeAdvances[Eos enabled = false] STARTED

org.apache.kafka.streams.TopologyTestDriverTest > shouldPunctuateIfWallClockTimeAdvances[Eos enabled = false] PASSED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':core:test'.
> There were failing tests. See the report at: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/reports/tests/test/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':streams:test'.
> There were failing tests. See the report at: <https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/build/reports/tests/test/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.1.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 30m 5s
173 actionable tasks: 170 executed, 3 up-to-date
Build step 'Execute shell' marked build as failure
[FINDBUGS] Collecting findbugs analysis files...
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
[FINDBUGS] Searching for all files in <https://builds.apache.org/job/kafka-trunk-jdk8/ws/> that match the pattern **/build/reports/*bugs/*.xml
[FINDBUGS] Parsing 17 files in <https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/clients/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/api/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/basic-auth-extension/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/file/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/json/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/runtime/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/transforms/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/examples/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/generator/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/jmh-benchmarks/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/log4j-appender/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/examples/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/streams-scala/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/test-utils/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file <https://builds.apache.org/job/kafka-trunk-jdk8/ws/tools/build/reports/spotbugs/main.xml> with 0 unique warnings and 0 duplicates.
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
No credentials specified
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
<Git Blamer> Using GitBlamer to create author and commit information for all warnings.
<Git Blamer> GIT_COMMIT=460e46c3bb76a361d0706b263c03696005e12566, workspace=<https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Computing warning deltas based on reference build #3444
Recording test results
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
Not sending mail to unregistered user wangguoz@gmail.com
Not sending mail to unregistered user noreply@github.com