Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/11/25 18:56:24 UTC

Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #3371

See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/3371/display/redirect>

Changes:


------------------------------------------
[...truncated 893.83 KB...]
    	isolation.level = read_uncommitted
    	key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    	max.partition.fetch.bytes = 1048576
    	max.poll.interval.ms = 300000
    	max.poll.records = 500
    	metadata.max.age.ms = 300000
    	metric.reporters = []
    	metrics.num.samples = 2
    	metrics.recording.level = INFO
    	metrics.sample.window.ms = 30000
    	partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    	receive.buffer.bytes = 524288
    	reconnect.backoff.max.ms = 1000
    	reconnect.backoff.ms = 50
    	request.timeout.ms = 30000
    	retry.backoff.ms = 100
    	sasl.client.callback.handler.class = null
    	sasl.jaas.config = null
    	sasl.kerberos.kinit.cmd = /usr/bin/kinit
    	sasl.kerberos.min.time.before.relogin = 60000
    	sasl.kerberos.service.name = null
    	sasl.kerberos.ticket.renew.jitter = 0.05
    	sasl.kerberos.ticket.renew.window.factor = 0.8
    	sasl.login.callback.handler.class = null
    	sasl.login.class = null
    	sasl.login.refresh.buffer.seconds = 300
    	sasl.login.refresh.min.period.seconds = 60
    	sasl.login.refresh.window.factor = 0.8
    	sasl.login.refresh.window.jitter = 0.05
    	sasl.mechanism = GSSAPI
    	security.protocol = PLAINTEXT
    	security.providers = null
    	send.buffer.bytes = 131072
    	session.timeout.ms = 10000
    	ssl.cipher.suites = null
    	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    	ssl.endpoint.identification.algorithm = https
    	ssl.key.password = null
    	ssl.keymanager.algorithm = SunX509
    	ssl.keystore.location = null
    	ssl.keystore.password = null
    	ssl.keystore.type = JKS
    	ssl.protocol = TLS
    	ssl.provider = null
    	ssl.secure.random.implementation = null
    	ssl.trustmanager.algorithm = PKIX
    	ssl.truststore.location = null
    	ssl.truststore.password = null
    	ssl.truststore.type = JKS
    	value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer

    Nov 25, 2022 6:41:15 PM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
    INFO: Kafka version: 2.4.1
    Nov 25, 2022 6:41:15 PM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
    INFO: Kafka commitId: c57222ae8cd7866b
    Nov 25, 2022 6:41:15 PM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
    INFO: Kafka startTimeMs: 1669401675614
    Nov 25, 2022 6:41:15 PM org.apache.kafka.clients.Metadata update
    INFO: [Consumer clientId=consumer-1, groupId=null] Cluster ID: TnYKezaYQciHKOIjiAL2Ww
    Nov 25, 2022 6:41:15 PM org.apache.beam.sdk.io.kafka.KafkaUnboundedSource split
    INFO: Partitions assigned to split 0 (total 1): beam-sdf-0
    Nov 25, 2022 6:41:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/Read(KafkaUnboundedSource)/StripIds as step s2
    Nov 25, 2022 6:41:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Measure read time as step s3
    Nov 25, 2022 6:41:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Map records to strings/Map as step s4
    Nov 25, 2022 6:41:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Counting element as step s5
    Nov 25, 2022 6:41:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.44.0-SNAPSHOT
    Nov 25, 2022 6:41:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-25_10_41_16-5229872133068499310?project=apache-beam-testing
    Nov 25, 2022 6:41:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2022-11-25_10_41_16-5229872133068499310
    Nov 25, 2022 6:41:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-11-25_10_41_16-5229872133068499310
    Nov 25, 2022 6:41:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2022-11-25T18:41:21.898Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: kafkaioit0testkafkaioreadsandwritescorrectlyinstreaming-je-z0w6. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
    Nov 25, 2022 6:41:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:36.126Z: Worker configuration: e2-standard-2 in us-central1-a.
    Nov 25, 2022 6:41:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:37.481Z: Expanding SplittableParDo operations into optimizable parts.
    Nov 25, 2022 6:41:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:37.524Z: Expanding CollectionToSingleton operations into optimizable parts.
    Nov 25, 2022 6:41:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:37.586Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 25, 2022 6:41:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:37.628Z: Expanding SplittableProcessKeyed operations into optimizable parts.
    Nov 25, 2022 6:41:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:37.652Z: Expanding GroupByKey operations into streaming Read/Write steps
    Nov 25, 2022 6:41:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:37.686Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 25, 2022 6:41:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:37.768Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 25, 2022 6:41:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:37.793Z: Fusing consumer Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/ParDo(GenerateKafkaSourceDescriptor)/ParMultiDo(GenerateKafkaSourceDescriptor) into Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/Impulse
    Nov 25, 2022 6:41:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:37.820Z: Fusing consumer Read-from-unbounded-Kafka-KafkaIO-Read-ReadFromKafkaViaSDF-KafkaIO-ReadSourceDescriptors-ParDo-Unbou/PairWithRestriction into Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/ParDo(GenerateKafkaSourceDescriptor)/ParMultiDo(GenerateKafkaSourceDescriptor)
    Nov 25, 2022 6:41:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:37.848Z: Fusing consumer Read-from-unbounded-Kafka-KafkaIO-Read-ReadFromKafkaViaSDF-KafkaIO-ReadSourceDescriptors-ParDo-Unbou/SplitWithSizing into Read-from-unbounded-Kafka-KafkaIO-Read-ReadFromKafkaViaSDF-KafkaIO-ReadSourceDescriptors-ParDo-Unbou/PairWithRestriction
    Nov 25, 2022 6:41:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:37.874Z: Fusing consumer Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/KafkaIO.ReadSourceDescriptors/MapElements/Map/ParMultiDo(Anonymous) into Read-from-unbounded-Kafka-KafkaIO-Read-ReadFromKafkaViaSDF-KafkaIO-ReadSourceDescriptors-ParDo-Unbou/ProcessElementAndRestrictionWithSizing
    Nov 25, 2022 6:41:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:37.907Z: Fusing consumer Measure read time/ParMultiDo(TimeMonitor) into Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/KafkaIO.ReadSourceDescriptors/MapElements/Map/ParMultiDo(Anonymous)
    Nov 25, 2022 6:41:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:37.936Z: Fusing consumer Map records to strings/Map/ParMultiDo(Anonymous) into Measure read time/ParMultiDo(TimeMonitor)
    Nov 25, 2022 6:41:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:37.963Z: Fusing consumer Counting element/ParMultiDo(Counting) into Map records to strings/Map/ParMultiDo(Anonymous)
    Nov 25, 2022 6:41:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:38.060Z: Running job using Streaming Engine
    Nov 25, 2022 6:41:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:39.313Z: Starting 5 workers in us-central1-a...
    Nov 25, 2022 6:41:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:41:57.212Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 25, 2022 6:42:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:42:23.094Z: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
    Nov 25, 2022 6:42:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:42:23.124Z: Autoscaling: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Nov 25, 2022 6:42:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:42:33.593Z: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
    Nov 25, 2022 6:42:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:42:33.617Z: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Nov 25, 2022 6:42:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:42:43.975Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
    Nov 25, 2022 6:43:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:43:36.547Z: All workers have finished the startup processes and began to receive work requests.
    Nov 25, 2022 6:43:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-11-25T18:43:39.154Z: Workers have started successfully.
    Nov 25, 2022 6:56:16 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    WARNING: No terminal state was returned within allotted timeout. State value RUNNING

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest FAILED

org.apache.beam.sdk.io.kafka.KafkaIOIT > testKafkaIOReadsAndWritesCorrectlyInStreaming FAILED
    java.lang.AssertionError: expected:<100000> but was:<200000>
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.failNotEquals(Assert.java:835)
        at org.junit.Assert.assertEquals(Assert.java:647)
        at org.junit.Assert.assertEquals(Assert.java:633)
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOReadsAndWritesCorrectlyInStreaming(KafkaIOIT.java:207)

1 test completed, 1 failed
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
:sdks:java:io:kafka:integrationTest (Thread[Execution worker Thread 6,5,main]) completed. Took 19 mins 52.333 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution worker,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution worker,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution worker,5,main]) started.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Custom actions are attached to task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
Caching disabled for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' because:
  Gradle would require more information to cache this task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi --force us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221125183446
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221125183446
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c51f2305407f86bec26ab9e409545b64799a9c072ef76619d1d61b7f5bac0eed
Deleted: sha256:88d03ce022235c60e15c94a012d723dcf4bca21097acc15101f5cf59ad5e429a
Deleted: sha256:bdc731725b7ddb42ba4d5d014d8d8dbe9d1860b31ed09a5c5e27180da38cc69e
Deleted: sha256:b56401afc2e2a9e51c50c83d41de46367b21ac5037b0d5b30fa2efcef0c44af8
Deleted: sha256:4197c52376f53ee199e9f2ac45f432d1e0f5cf9fd0cc9f070db9c85eda81a5bf
Deleted: sha256:fd1bd9428f1122668fb826a170dd6f2afbe4ce0fba1540b70fa1a28c24a09bfc
Deleted: sha256:20aa13ac0c70a2fd97f48a47013b262695490d5007a2490584e2e71fd6378418
Deleted: sha256:8668ac1d3ef4e1fc2416bc9400340a1a55de298f436d67d9cea40e5fa3cdd5ad
Deleted: sha256:8a86ea00270e5e01009be825d862ed73e013008dc5f5b69f54e50b33568804a3
Deleted: sha256:eb27cc3d5fa97e1ff17c1ee176602d1d20f24601b40160391399acefd79e2456
Deleted: sha256:40b01cd6505fd07e54aa40a955ac2ea89f22eb4d0e58792ceb8433ba2f3227ef
Deleted: sha256:8f034fa3b03435f32894373ad00b99a1f21294d54d03e725da279886dd8e0347
Deleted: sha256:6e0d0949e6253d2ddb0009c3107b9d67ac0b41ea5bdc8b596f342184f82117a5
Deleted: sha256:f85225126209c91e84016f7e57a576c80bc10656f1416bc19a2adf9166b009ed
Deleted: sha256:1a46ba4596978bd75150043df3e22040776ba7769326887502d9e951c1f3da3e
Deleted: sha256:fa9159203161da5f63924dedb3990a7a2f26a50904b459143bd189cd4f0036f7
Deleted: sha256:d76b948efd34a99fb042b879cad628864caeb13db332a3a3561a1e4013da48a7
Deleted: sha256:719aedb5fe9a00b080d886164af636d095aa33099c3f4e9d182e94ffd14e3b12
Deleted: sha256:ea9876ab562243c511d9ec63b7cd63f59516890969f97a5afa9283db11cc0bea
Deleted: sha256:0da72ea73778cc5071f9153dc95dc5d167de78d646ac0bf1bb081bc16e8fed6c
Deleted: sha256:cbaaf2634ca16ab1fc3364b04585fa6dd9877edf13411c9fa2e133fd80454c9b
Deleted: sha256:afff3a508525e6986384c289d5ff251cc6f8d7867490cabd31e6be85c2069d96
Deleted: sha256:d0e2669695383c81cae649f61d5455d76c0e9f7766ae7255b89540295ead0478
Deleted: sha256:ece563cd6967f860376c5fbe5c56e1bf65f2a96ddb0136e68eb8f577859e13a0
Deleted: sha256:964427ef889f7a7301f125a08a8ad30ae8039c4bb2c49a80ba59ed108e2be36d
Deleted: sha256:35a93364d0c69ca6d1c297d4d6c844fd76adfce43ec00beb40799966db18699b
Deleted: sha256:e3fdd91430f7aa6727c57373df5840472cd88fee72a15a1391992682bd617353
Deleted: sha256:3d2cb74ba67ca360351ad788d227c906e108da1bd4e5a7881feb38b7549c0ab9
Deleted: sha256:762016ed6baf81f1208138fcf3fb1f19696b74a1d82840dba29c0707253afdde
Deleted: sha256:e793383ed9e360fd105e08698c4733c1712e15e7288507ff35b3a7967a471d6a
Deleted: sha256:f103470dc128dfa342c6ee9ca06379e657fd983102f3d5a6032549f994c5344e
Deleted: sha256:29a5fef280afcec7eff3400781f6b7622d8647053e0f9bda121f0276c5e2374f
Deleted: sha256:0afa9513b5aa0324a3667e9f4245484a2ec746271e01268fb4fa839fb2d9f3f1
Deleted: sha256:3fdd54a893f216bf4132021f03509e8776294efcb837e2632d308598d65097d8
Deleted: sha256:950888fbe40e005a13a105a5fa688bec0bcec89ae5cc626b1c237b3678d63f38
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images untag us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221125183446
Successfully started process 'command 'gcloud''
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221125183446]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c51f2305407f86bec26ab9e409545b64799a9c072ef76619d1d61b7f5bac0eed]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221125183446] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c51f2305407f86bec26ab9e409545b64799a9c072ef76619d1d61b7f5bac0eed])].
Starting process 'command './scripts/cleanup_untagged_gcr_images.sh''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: ./scripts/cleanup_untagged_gcr_images.sh us.gcr.io/apache-beam-testing/java-postcommit-it/java
Successfully started process 'command './scripts/cleanup_untagged_gcr_images.sh''
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c51f2305407f86bec26ab9e409545b64799a9c072ef76619d1d61b7f5bac0eed
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c51f2305407f86bec26ab9e409545b64799a9c072ef76619d1d61b7f5bac0eed
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c51f2305407f86bec26ab9e409545b64799a9c072ef76619d1d61b7f5bac0eed].
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution worker,5,main]) completed. Took 4.537 secs.
Resolve mutations for :sdks:java:io:kafka:cleanUp (Thread[Execution worker Thread 6,5,main]) started.
Resolve mutations for :sdks:java:io:kafka:cleanUp (Thread[Execution worker Thread 6,5,main]) completed. Took 0.0 secs.
:sdks:java:io:kafka:cleanUp (Thread[Execution worker Thread 7,5,main]) started.

> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
:sdks:java:io:kafka:cleanUp (Thread[Execution worker Thread 7,5,main]) completed. Took 0.0 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 21m 48s
156 actionable tasks: 97 executed, 55 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/uetcnigcecsfq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_Kafka_IO #3372

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/3372/display/redirect?page=changes>
