Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2023/01/13 13:18:45 UTC

Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #3550

See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/3550/display/redirect?page=changes>

Changes:

[noreply] Bump google.golang.org/api from 0.106.0 to 0.107.0 in /sdks (#24989)


------------------------------------------
[...truncated 930.14 KB...]
    INFO: Kafka startTimeMs: 1673614407461
    Jan 13, 2023 12:53:27 PM org.apache.kafka.clients.Metadata update
    INFO: [Consumer clientId=consumer-1, groupId=null] Cluster ID: dcCdXNZJTjKU4cYOdIaGYA
    Jan 13, 2023 12:53:27 PM org.apache.beam.sdk.io.kafka.KafkaUnboundedSource split
    INFO: Partitions assigned to split 0 (total 1): beam-sdf-0
    Jan 13, 2023 12:53:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/Read(KafkaUnboundedSource)/StripIds as step s2
    Jan 13, 2023 12:53:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Measure read time as step s3
    Jan 13, 2023 12:53:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Map records to strings/Map as step s4
    Jan 13, 2023 12:53:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Counting element as step s5
    Jan 13, 2023 12:53:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.45.0-SNAPSHOT
    Jan 13, 2023 12:53:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-01-13_04_53_27-8873737331194354344?project=apache-beam-testing
    Jan 13, 2023 12:53:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2023-01-13_04_53_27-8873737331194354344
    Jan 13, 2023 12:53:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2023-01-13_04_53_27-8873737331194354344
    Jan 13, 2023 12:54:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2023-01-13T12:54:26.350Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: kafkaioit0testkafkaioreadsandwritescorrectlyinstreaming-je-3ixf. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
    Jan 13, 2023 12:54:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:42.698Z: Worker configuration: e2-standard-2 in us-central1-a.
    Jan 13, 2023 12:54:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:43.956Z: Expanding SplittableParDo operations into optimizable parts.
    Jan 13, 2023 12:54:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:44.026Z: Expanding CollectionToSingleton operations into optimizable parts.
    Jan 13, 2023 12:54:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:44.099Z: Expanding CoGroupByKey operations into optimizable parts.
    Jan 13, 2023 12:54:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:44.137Z: Expanding SplittableProcessKeyed operations into optimizable parts.
    Jan 13, 2023 12:54:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:44.168Z: Expanding GroupByKey operations into streaming Read/Write steps
    Jan 13, 2023 12:54:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:44.203Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Jan 13, 2023 12:54:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:44.271Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Jan 13, 2023 12:54:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:44.301Z: Fusing consumer Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/ParDo(GenerateKafkaSourceDescriptor)/ParMultiDo(GenerateKafkaSourceDescriptor) into Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/Impulse
    Jan 13, 2023 12:54:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:44.339Z: Fusing consumer Read-from-unbounded-Kafka-KafkaIO-Read-ReadFromKafkaViaSDF-KafkaIO-ReadSourceDescriptors-ParDo-Unbou/PairWithRestriction into Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/ParDo(GenerateKafkaSourceDescriptor)/ParMultiDo(GenerateKafkaSourceDescriptor)
    Jan 13, 2023 12:54:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:44.370Z: Fusing consumer Read-from-unbounded-Kafka-KafkaIO-Read-ReadFromKafkaViaSDF-KafkaIO-ReadSourceDescriptors-ParDo-Unbou/SplitWithSizing into Read-from-unbounded-Kafka-KafkaIO-Read-ReadFromKafkaViaSDF-KafkaIO-ReadSourceDescriptors-ParDo-Unbou/PairWithRestriction
    Jan 13, 2023 12:54:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:44.395Z: Fusing consumer Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/KafkaIO.ReadSourceDescriptors/MapElements/Map/ParMultiDo(Anonymous) into Read-from-unbounded-Kafka-KafkaIO-Read-ReadFromKafkaViaSDF-KafkaIO-ReadSourceDescriptors-ParDo-Unbou/ProcessElementAndRestrictionWithSizing
    Jan 13, 2023 12:54:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:44.427Z: Fusing consumer Measure read time/ParMultiDo(TimeMonitor) into Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/KafkaIO.ReadSourceDescriptors/MapElements/Map/ParMultiDo(Anonymous)
    Jan 13, 2023 12:54:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:44.460Z: Fusing consumer Map records to strings/Map/ParMultiDo(Anonymous) into Measure read time/ParMultiDo(TimeMonitor)
    Jan 13, 2023 12:54:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:44.485Z: Fusing consumer Counting element/ParMultiDo(Counting) into Map records to strings/Map/ParMultiDo(Anonymous)
    Jan 13, 2023 12:54:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:44.589Z: Running job using Streaming Engine
    Jan 13, 2023 12:54:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:45.853Z: Starting 5 workers in us-central1-a...
    Jan 13, 2023 12:54:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:54:56.734Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Jan 13, 2023 12:55:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:55:28.654Z: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
    Jan 13, 2023 12:55:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:55:28.674Z: Autoscaling: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Jan 13, 2023 12:55:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:55:38.106Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
    Jan 13, 2023 12:56:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:56:38.229Z: Workers have started successfully.
    Jan 13, 2023 12:56:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-01-13T12:56:38.813Z: All workers have finished the startup processes and began to receive work requests.
    Jan 13, 2023 1:18:36 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    WARNING: No terminal state was returned within allotted timeout. State value RUNNING
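
The WARNING above is what DataflowPipelineJob.waitUntilFinish logs when a bounded wait expires while the streaming job is still RUNNING. Below is a minimal sketch of bounding the wait in a test; the class name and the 20-minute timeout are hypothetical and the real KafkaIOIT setup is omitted.

    // Hedged sketch, not the actual KafkaIOIT code: bound waitUntilFinish so a
    // streaming job that never reaches a terminal state does not hang the test.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.joda.time.Duration;

    public class StreamingWaitSketch {
      static PipelineResult.State runWithTimeout(Pipeline pipeline) {
        PipelineResult result = pipeline.run();
        // Returns a terminal state, or null / a non-terminal state if the timeout
        // expires first; on Dataflow the runner then logs the warning seen above.
        PipelineResult.State state = result.waitUntilFinish(Duration.standardMinutes(20));
        if (state == null || !state.isTerminal()) {
          // Still RUNNING: the test must decide whether to cancel the job or read metrics anyway.
        }
        return state;
      }
    }
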
    Jan 13, 2023 1:18:36 PM org.apache.kafka.common.config.AbstractConfig logAll
    INFO: AdminClientConfig values: 
    	bootstrap.servers = [34.27.191.128:32404, 35.192.133.242:32405, 34.66.70.226:32406]
    	client.dns.lookup = default
    	client.id = 
    	connections.max.idle.ms = 300000
    	metadata.max.age.ms = 300000
    	metric.reporters = []
    	metrics.num.samples = 2
    	metrics.recording.level = INFO
    	metrics.sample.window.ms = 30000
    	receive.buffer.bytes = 65536
    	reconnect.backoff.max.ms = 1000
    	reconnect.backoff.ms = 50
    	request.timeout.ms = 120000
    	retries = 5
    	retry.backoff.ms = 100
    	sasl.client.callback.handler.class = null
    	sasl.jaas.config = null
    	sasl.kerberos.kinit.cmd = /usr/bin/kinit
    	sasl.kerberos.min.time.before.relogin = 60000
    	sasl.kerberos.service.name = null
    	sasl.kerberos.ticket.renew.jitter = 0.05
    	sasl.kerberos.ticket.renew.window.factor = 0.8
    	sasl.login.callback.handler.class = null
    	sasl.login.class = null
    	sasl.login.refresh.buffer.seconds = 300
    	sasl.login.refresh.min.period.seconds = 60
    	sasl.login.refresh.window.factor = 0.8
    	sasl.login.refresh.window.jitter = 0.05
    	sasl.mechanism = GSSAPI
    	security.protocol = PLAINTEXT
    	security.providers = null
    	send.buffer.bytes = 131072
    	ssl.cipher.suites = null
    	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    	ssl.endpoint.identification.algorithm = https
    	ssl.key.password = null
    	ssl.keymanager.algorithm = SunX509
    	ssl.keystore.location = null
    	ssl.keystore.password = null
    	ssl.keystore.type = JKS
    	ssl.protocol = TLS
    	ssl.provider = null
    	ssl.secure.random.implementation = null
    	ssl.trustmanager.algorithm = PKIX
    	ssl.truststore.location = null
    	ssl.truststore.password = null
    	ssl.truststore.type = JKS

    Jan 13, 2023 1:18:36 PM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
    INFO: Kafka version: 2.4.1
    Jan 13, 2023 1:18:36 PM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
    INFO: Kafka commitId: c57222ae8cd7866b
    Jan 13, 2023 1:18:36 PM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
    INFO: Kafka startTimeMs: 1673615916843
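
The AdminClientConfig dump and Kafka version lines above are emitted when a Kafka AdminClient is constructed; here one is created right after the timed wait returns. The following is a rough sketch of building an equivalent client, assuming only the bootstrap servers and request timeout from the dump are set explicitly and everything else is left at Kafka 2.4 defaults; it is not the test's actual code.

    // Hedged sketch: constructing a Kafka AdminClient whose creation logs an
    // "AdminClientConfig values:" block like the one above.
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;

    public class AdminClientSketch {
      public static AdminClient create() {
        Properties props = new Properties();
        // Broker addresses copied from the dump above; any reachable brokers would do.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG,
            "34.27.191.128:32404,35.192.133.242:32405,34.66.70.226:32406");
        props.put(AdminClientConfig.REQUEST_TIMEOUT_MS_CONFIG, "120000");
        // org.apache.kafka.common.config.AbstractConfig logs the resolved config here.
        return AdminClient.create(props);
      }
    }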

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest FAILED

org.apache.beam.sdk.io.kafka.KafkaIOIT > testKafkaIOReadsAndWritesCorrectlyInStreaming FAILED
    java.lang.AssertionError: actual number of records 25228927 smaller than expected: 100000000.
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.assertTrue(Assert.java:42)
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOReadsAndWritesCorrectlyInStreaming(KafkaIOIT.java:225)
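
The failed assertion reports that the streaming read observed only 25228927 of the expected 100000000 records before the test gave up, which is consistent with the earlier warning that the job was still RUNNING when the timed wait expired. The sketch below shows the kind of check that produces this message, with hypothetical variable names rather than the actual KafkaIOIT.java:225 source.

    // Hedged sketch: a JUnit record-count assertion that fails exactly like the
    // stack trace above when fewer records are read back than were written.
    import static org.junit.Assert.assertTrue;

    public class RecordCountAssertionSketch {
      static void assertReadCount(long actualRecords, long expectedRecords) {
        assertTrue(
            String.format(
                "actual number of records %d smaller than expected: %d.",
                actualRecords, expectedRecords),
            actualRecords >= expectedRecords);
      }
    }

Called as assertReadCount(25228927L, 100000000L), this fails with the same message shown in the stack trace.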

1 test completed, 1 failed
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
:sdks:java:io:kafka:integrationTest (Thread[Execution worker Thread 3,5,main]) completed. Took 45 mins 31.905 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution worker Thread 2,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution worker Thread 2,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution worker,5,main]) started.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Custom actions are attached to task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
Caching disabled for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' because:
  Gradle would require more information to cache this task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi --force us.gcr.io/apache-beam-testing/java-postcommit-it/java:20230113123211
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20230113123211
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d09b0737221cb178174cfd158419381eeb153b775c60f5d269ffef405414e55f
Deleted: sha256:a6f3d3bf987507efff6e3e31e733d6a43df0080b5484d20e2ce8912520972e76
Deleted: sha256:0f945eed3ea18db4cd67c7e33173d9d73785e38a1aa22dfe196949c7faeabfbf
Deleted: sha256:d95cb1752a3fefab2eef6f744486685537601aeb5daf89d7d8b46e70e0676b08
Deleted: sha256:2a7da575df61264471f87504dfe6e214f8b590e98945ebb2bcd4dc534adb4de7
Deleted: sha256:ebc0e0c360f2c96ee3d8727d2ba9b28690a210bae30b96fc4dccb151de009331
Deleted: sha256:63be7cda58b4e34cafb84d4f26987820cbc9ba1fbf14c59a34d962e1d876a6e4
Deleted: sha256:c4517f4450a9479266c539840f154a316e5dfd5a1a432ee67a2cfb0ebc87cdf5
Deleted: sha256:9386db5af03c6dac95a9f6049a7c7b2f92e10982445d2151bd80204a77cdbc9d
Deleted: sha256:a2feed4ec3f45c31ee5f5facb4b6fc67bf382e0e335fa7f34ff5e23b129aa9ce
Deleted: sha256:fd19c369d0a401d7d8ffae9830f451326914e3d284befb37377ee06e30298257
Deleted: sha256:cd14d5ce06c651d2251fda31c53ced16eb9a406f93b6d144630570a56e748459
Deleted: sha256:3922eb0acd3a509d1d1b1893a95b4d9842dcf94fbf657cc2c76b3175e245aeae
Deleted: sha256:91a135e4912be22afc0e1897effeb287f9458acd49b830394e35486ce9df4aa0
Deleted: sha256:94691d5b156ee75051a4b0993b337b8b8726e5a80ea258a6ac49ed00330db10e
Deleted: sha256:b55e0d9b979c126386998f16f1bab7b0fa20296b8417ae48b1f7935ad7564f4d
Deleted: sha256:cd93366eb17da58e589f50865b83dd092c6aab274a37eed6fdb38c3e2ef52e71
Deleted: sha256:b3527e0791a1d19d207a441174801b25693672837fba15b51ff4ba16e7009749
Deleted: sha256:ecb60555251726774f0560b13441c16903b94b773f3f4d06e2ad594b20c229ba
Deleted: sha256:c53ae6ee1cf6f0d9d8e70173f0db26980f466349b4ed66b4a6b27ef4e3841e13
Deleted: sha256:18e44fdbe02c8ffe282ae6d86d21c818c4a60a2514723e6a4e042aa7afb3985b
Deleted: sha256:00ece275e1cd041d2fd4097b293ed9627256b435d48f9192554caac0078aced5
Deleted: sha256:0d9c50d4d0b732004c810c0518baf6714430251ecc7ffcba2b9b4a5241f6a754
Deleted: sha256:32986728540620078d1638a2deb61ec57784b16239222c1d882d8bb20b5a66a3
Deleted: sha256:cb660df928af983c8fafcd78585dea4fb7f5b3d8abc69813b2d25db3b3ee634b
Deleted: sha256:b5e678ae1d93ae03d68c444befa9d1b8d63cbd2b8758acfb88d552ee4f436b91
Deleted: sha256:70352fe55f16a6d9c004347647fb8c9961e8f82b674a2d1657651011d88daad6
Deleted: sha256:de9f5c3c20af219338553ad5d4f88c01a9e1fb8334a8986afa0f66a244c44e72
Deleted: sha256:4f59bc461cf8e01624db9b7be84a1362216058dc1759f50c2d5c94c833a08916
Deleted: sha256:ee3de70dadd555d7000ab3e09a546a4129fa45407251b5af79a4b2c8918e96e8
Deleted: sha256:b69a859338a0d782c854c0e3ff99d1073205378f7649a0740330c3ece30a9dc9
Deleted: sha256:dee0b8365b91968b62565edbedc118ee78aab86cbf444ef759262ebff5794eea
Deleted: sha256:c72d577d6226c2324f025e0131e6d344ba500e264dfe1dc32a191217f385c8c1
Deleted: sha256:cbca76fed899272d55f20af17f6431ee57a200da588b63cb9ef967130ef38a2d
Deleted: sha256:7c74d44d337e5fb6054689858fa0df8ad17344e2a95d5cca211e1bb2a8f38d1b
Deleted: sha256:367004e7364ed33b589f0eda5c51439c97afea91e14c6c41cfcb7b03ddcd8239
Deleted: sha256:9f9c4a60c88fe55e1d8f71bbeb52635f50dfd1e66c77d7534935b533a5b7e7f1
Deleted: sha256:6d5e25f21ee3756717c5871da53861548a37706c435db9d2a99fd4120c749b51
Deleted: sha256:723316496359f1bf8bc298e179e1c4f8ee22d628b7a5d7b19c32eb35365b8eff
Deleted: sha256:fae3107e1ed869013ae97b3cd6ba3007d04a26bbb2c32171e93e258083609e4e
Deleted: sha256:a724e957876171bffe25010197947d74cde30092d3454870d740da64bf371092
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images untag us.gcr.io/apache-beam-testing/java-postcommit-it/java:20230113123211
Successfully started process 'command 'gcloud''
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20230113123211]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d09b0737221cb178174cfd158419381eeb153b775c60f5d269ffef405414e55f]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20230113123211] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d09b0737221cb178174cfd158419381eeb153b775c60f5d269ffef405414e55f])].
Starting process 'command './scripts/cleanup_untagged_gcr_images.sh''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: ./scripts/cleanup_untagged_gcr_images.sh us.gcr.io/apache-beam-testing/java-postcommit-it/java
Successfully started process 'command './scripts/cleanup_untagged_gcr_images.sh''
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d09b0737221cb178174cfd158419381eeb153b775c60f5d269ffef405414e55f
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d09b0737221cb178174cfd158419381eeb153b775c60f5d269ffef405414e55f
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d09b0737221cb178174cfd158419381eeb153b775c60f5d269ffef405414e55f].
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution worker,5,main]) completed. Took 5.204 secs.
Resolve mutations for :sdks:java:io:kafka:cleanUp (Thread[Execution worker Thread 3,5,main]) started.
Resolve mutations for :sdks:java:io:kafka:cleanUp (Thread[Execution worker Thread 3,5,main]) completed. Took 0.0 secs.
:sdks:java:io:kafka:cleanUp (Thread[included builds,5,main]) started.

> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
:sdks:java:io:kafka:cleanUp (Thread[included builds,5,main]) completed. Took 0.0 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 46m 46s
157 actionable tasks: 98 executed, 55 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/umgck3d3ltfzu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_Kafka_IO #3551

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/3551/display/redirect?page=changes>

