Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2023/10/30 00:43:18 UTC
Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #4151
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/4151/display/redirect>
Changes:
------------------------------------------
[...truncated 563.36 KB...]
Oct 30, 2023 12:39:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2023-10-30T00:39:36.021Z: org.apache.beam.sdk.util.UserCodeException: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)
at org.apache.beam.sdk.io.kafka.KafkaWriter$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
at org.apache.beam.sdk.transforms.MapElements$2.processElement(MapElements.java:151)
at org.apache.beam.sdk.transforms.MapElements$2$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)
at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:321)
at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)
at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)
at org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)
at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)
at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)
at org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
at org.apache.beam.sdk.io.kafka.KafkaWriter.checkForFailures(KafkaWriter.java:149)
at org.apache.beam.sdk.io.kafka.KafkaWriter.processElement(KafkaWriter.java:62)
Caused by: org.apache.kafka.common.errors.TimeoutException: Topic beam-sdf not present in metadata after 60000 ms.
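
The root cause above is a producer timeout because the topic beam-sdf is missing from broker metadata. A minimal sketch, assuming a reachable cluster, of checking for the topic with Kafka's AdminClient and creating it if absent; the bootstrap servers are copied from the consumer config logged later in this message, and the partition/replication counts are placeholders rather than the job's real settings:

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;

    public class EnsureTopicExists {
      public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Brokers copied from the ConsumerConfig dump below; substitute your own cluster.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG,
            "35.202.229.151:32401,34.41.224.219:32402,34.71.61.209:32403");
        try (AdminClient admin = AdminClient.create(props)) {
          if (!admin.listTopics().names().get().contains("beam-sdf")) {
            // Placeholder partition and replication counts, not the job's settings.
            admin.createTopics(Collections.singleton(new NewTopic("beam-sdf", 3, (short) 3)))
                .all()
                .get();
          }
        }
      }
    }
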
Oct 30, 2023 12:39:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-30T00:39:36.105Z: Finished operation Generate-records-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing+Measure write time/ParMultiDo(TimeMonitor)+Write to Kafka/Kafka ProducerRecord/Map/ParMultiDo(Anonymous)+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)/ParMultiDo(KafkaWriter)
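
The fused step that failed ends in Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter). As a rough sketch of what such a write transform looks like in the Beam Java SDK, assuming byte-array keys and values and a placeholder broker (the actual KafkaIOIT pipeline may differ):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.ByteArrayCoder;
    import org.apache.beam.sdk.coders.KvCoder;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.KV;
    import org.apache.kafka.common.serialization.ByteArraySerializer;

    public class WriteToKafkaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        p.apply("Generate records",
                Create.of(KV.of(new byte[] {1}, new byte[] {2}))
                    .withCoder(KvCoder.of(ByteArrayCoder.of(), ByteArrayCoder.of())))
            .apply("Write to Kafka",
                KafkaIO.<byte[], byte[]>write()
                    .withBootstrapServers("localhost:9092") // placeholder broker, not the test cluster
                    .withTopic("beam-sdf")                   // topic named in the error above
                    .withKeySerializer(ByteArraySerializer.class)
                    .withValueSerializer(ByteArraySerializer.class));
        p.run().waitUntilFinish();
      }
    }
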
Oct 30, 2023 12:39:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-30T00:39:36.199Z: Cleaning up.
Oct 30, 2023 12:39:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-30T00:39:36.250Z: Stopping worker pool...
Oct 30, 2023 12:41:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-30T00:41:57.390Z: Autoscaling: Resized worker pool from 5 to 0.
Oct 30, 2023 12:41:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-30T00:41:57.419Z: Worker pool stopped.
Oct 30, 2023 12:42:04 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2023-10-29_17_34_21-10806110389812159611 finished with status DONE.
Oct 30, 2023 12:42:04 AM org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory tryCreateDefaultBucket
INFO: No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
Oct 30, 2023 12:42:04 AM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
Oct 30, 2023 12:42:04 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 30, 2023 12:42:04 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 287 files. Enable logging at DEBUG level to see which files will be staged.
Oct 30, 2023 12:42:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 30, 2023 12:42:07 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 287 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 30, 2023 12:42:08 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 287 files cached, 0 files newly uploaded in 0 seconds
Oct 30, 2023 12:42:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://dataflow-staging-us-central1-844138762903/temp/staging/
Oct 30, 2023 12:42:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <193309 bytes, hash b456678b5ecf5d6d529ffc3f911f262633769aab96137edbdadcb7b639fce4dd> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-tFZni17PXW1Sn_w_kR8mJjN2mquWE37b2ty3tjn85N0.pb
Oct 30, 2023 12:42:09 AM org.apache.beam.sdk.coders.SerializableCoder checkEqualsMethodDefined
WARNING: Can't verify serialized elements of type BoundedSource have well defined equals method. This may produce incorrect results on some PipelineRunner implementations
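
The warning above means SerializableCoder cannot confirm that the serialized BoundedSource overrides equals(), so element equality falls back to comparing serialized bytes. For user-defined element types encoded with SerializableCoder, declaring equals() and hashCode() removes that ambiguity; a minimal, hypothetical example (not a class from this job):

    import java.io.Serializable;
    import java.util.Objects;

    // Hypothetical element type with a well-defined equals(), so SerializableCoder
    // does not have to rely on byte-for-byte comparison of serialized forms.
    public class SensorReading implements Serializable {
      private final String sensorId;
      private final double value;

      public SensorReading(String sensorId, double value) {
        this.sensorId = sensorId;
        this.value = value;
      }

      @Override
      public boolean equals(Object o) {
        if (this == o) {
          return true;
        }
        if (!(o instanceof SensorReading)) {
          return false;
        }
        SensorReading other = (SensorReading) o;
        return Double.compare(value, other.value) == 0
            && Objects.equals(sensorId, other.sensorId);
      }

      @Override
      public int hashCode() {
        return Objects.hash(sensorId, value);
      }
    }
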
Oct 30, 2023 12:42:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/Read(KafkaUnboundedSource)/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 30, 2023 12:42:12 AM org.apache.kafka.common.config.AbstractConfig logAll
INFO: ConsumerConfig values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers = [35.202.229.151:32401, 34.41.224.219:32402, 34.71.61.209:32403]
check.crcs = true
client.dns.lookup = default
client.id =
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = null
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
isolation.level = read_uncommitted
key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
receive.buffer.bytes = 524288
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = https
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
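
The ConsumerConfig values above are what KafkaIO's read transform hands to the Kafka consumer it creates. A sketch of a read configured the same way (byte-array deserializers, auto.offset.reset = earliest); the brokers come from the dump above, the topic name from the write-side error earlier in this log, and the rest of the pipeline is assumed:

    import java.util.Collections;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.common.serialization.ByteArrayDeserializer;

    public class ReadFromKafkaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        p.apply("Read from unbounded Kafka",
            KafkaIO.<byte[], byte[]>read()
                // Brokers copied from the ConsumerConfig dump above.
                .withBootstrapServers(
                    "35.202.229.151:32401,34.41.224.219:32402,34.71.61.209:32403")
                // Topic name taken from the write-side failure earlier in this log.
                .withTopic("beam-sdf")
                .withKeyDeserializer(ByteArrayDeserializer.class)
                .withValueDeserializer(ByteArrayDeserializer.class)
                // Mirrors auto.offset.reset = earliest from the dump above.
                .withConsumerConfigUpdates(Collections.<String, Object>singletonMap(
                    ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest"))
                // Bound the otherwise-unbounded read so a local run can terminate.
                .withMaxNumRecords(100));
        p.run().waitUntilFinish();
      }
    }
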
Oct 30, 2023 12:42:12 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka version: 2.4.1
Oct 30, 2023 12:42:12 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka commitId: c57222ae8cd7866b
Oct 30, 2023 12:42:12 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka startTimeMs: 1698626532134
Gradle Test Executor 1 finished executing tests.
> Task :sdks:java:io:kafka:integrationTest
org.apache.beam.sdk.io.kafka.KafkaIOIT > testKafkaIOReadsAndWritesCorrectlyInStreaming FAILED
java.lang.RuntimeException: org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
at org.apache.beam.runners.dataflow.ReadTranslator.translateReadHelper(ReadTranslator.java:55)
at org.apache.beam.runners.dataflow.DataflowRunner$StreamingUnboundedRead$ReadWithIdsTranslator.translate(DataflowRunner.java:2166)
at org.apache.beam.runners.dataflow.DataflowRunner$StreamingUnboundedRead$ReadWithIdsTranslator.translate(DataflowRunner.java:2161)
at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.visitPrimitiveTransform(DataflowPipelineTranslator.java:498)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:593)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:466)
at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.translate(DataflowPipelineTranslator.java:431)
at org.apache.beam.runners.dataflow.DataflowPipelineTranslator.translate(DataflowPipelineTranslator.java:188)
at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1227)
at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:198)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:321)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:398)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
at org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOReadsAndWritesCorrectlyInStreaming(KafkaIOIT.java:216)
Caused by:
org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
1 test completed, 1 failed
Finished generating test XML results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
> Task :sdks:java:io:kafka:integrationTest FAILED
Resolve mutations for :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution worker Thread 6,5,main]) started.
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution worker Thread 4,5,main]) started.
> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Custom actions are attached to task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
Caching disabled for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' because:
Gradle would require more information to cache this task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' is not up-to-date because:
Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi --force us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231030003317
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231030003317
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7557123a2659c33d00cba924a8e627417ba82f538917a801a35195c033513828
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images untag us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231030003317
Successfully started process 'command 'gcloud''
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231030003317]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7557123a2659c33d00cba924a8e627417ba82f538917a801a35195c033513828]
Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231030003317] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7557123a2659c33d00cba924a8e627417ba82f538917a801a35195c033513828])].
Starting process 'command './scripts/cleanup_untagged_gcr_images.sh''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: ./scripts/cleanup_untagged_gcr_images.sh us.gcr.io/apache-beam-testing/java-postcommit-it/java
Successfully started process 'command './scripts/cleanup_untagged_gcr_images.sh''
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:7557123a2659c33d00cba924a8e627417ba82f538917a801a35195c033513828
Resolve mutations for :sdks:java:io:kafka:cleanUp (Thread[Execution worker Thread 4,5,main]) started.
:sdks:java:io:kafka:cleanUp (Thread[included builds,5,main]) started.
> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>
Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
For more on this, please refer to https://docs.gradle.org/8.4/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.
BUILD FAILED in 10m 15s
162 actionable tasks: 103 executed, 57 from cache, 2 up-to-date
Publishing build scan...
https://ge.apache.org/s/z5z7rdknyl4cu
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #4156
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/4156/display/redirect>
Changes:
------------------------------------------
[...truncated 541.46 KB...]
Nov 01, 2023 12:39:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2023-11-01T12:39:02.922Z: org.apache.beam.sdk.util.UserCodeException: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)
at org.apache.beam.sdk.io.kafka.KafkaWriter$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
at org.apache.beam.sdk.transforms.MapElements$2.processElement(MapElements.java:151)
at org.apache.beam.sdk.transforms.MapElements$2$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)
at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:321)
at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)
at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)
at org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)
at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)
at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)
at org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
at org.apache.beam.sdk.io.kafka.KafkaWriter.checkForFailures(KafkaWriter.java:149)
at org.apache.beam.sdk.io.kafka.KafkaWriter.processElement(KafkaWriter.java:62)
Caused by: org.apache.kafka.common.errors.TimeoutException: Topic beam-sdf not present in metadata after 60000 ms.
Nov 01, 2023 12:39:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-11-01T12:39:03.009Z: Finished operation Generate-records-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing+Measure write time/ParMultiDo(TimeMonitor)+Write to Kafka/Kafka ProducerRecord/Map/ParMultiDo(Anonymous)+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)/ParMultiDo(KafkaWriter)
Nov 01, 2023 12:39:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-11-01T12:39:03.129Z: Cleaning up.
Nov 01, 2023 12:39:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-11-01T12:39:03.191Z: Stopping worker pool...
Nov 01, 2023 12:41:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-11-01T12:41:24.364Z: Autoscaling: Resized worker pool from 5 to 0.
Nov 01, 2023 12:41:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-11-01T12:41:24.421Z: Worker pool stopped.
Nov 01, 2023 12:41:31 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2023-11-01_05_33_53-9210226366787192734 finished with status DONE.
Nov 01, 2023 12:41:31 PM org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory tryCreateDefaultBucket
INFO: No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
Nov 01, 2023 12:41:32 PM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
Nov 01, 2023 12:41:32 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 01, 2023 12:41:32 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 287 files. Enable logging at DEBUG level to see which files will be staged.
Nov 01, 2023 12:41:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 01, 2023 12:41:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 287 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 01, 2023 12:41:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 287 files cached, 0 files newly uploaded in 0 seconds
Nov 01, 2023 12:41:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://dataflow-staging-us-central1-844138762903/temp/staging/
Nov 01, 2023 12:41:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <193298 bytes, hash e39a4239b2f27da0b669276a5cd8aaec1d3d65dd8fbb0f31769c11aaa53e5fe8> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-45pCObLyfaC2aSdqXNiq7B09Zd2Puw8xdpwRqqU-X-g.pb
Nov 01, 2023 12:41:36 PM org.apache.beam.sdk.coders.SerializableCoder checkEqualsMethodDefined
WARNING: Can't verify serialized elements of type BoundedSource have well defined equals method. This may produce incorrect results on some PipelineRunner implementations
Nov 01, 2023 12:41:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/Read(KafkaUnboundedSource)/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 01, 2023 12:41:39 PM org.apache.kafka.common.config.AbstractConfig logAll
INFO: ConsumerConfig values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers = [34.41.211.121:32401, 35.193.180.160:32402, 35.239.102.178:32403]
check.crcs = true
client.dns.lookup = default
client.id =
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = null
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
isolation.level = read_uncommitted
key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
receive.buffer.bytes = 524288
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = https
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
Nov 01, 2023 12:41:39 PM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka version: 2.4.1
Nov 01, 2023 12:41:39 PM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka commitId: c57222ae8cd7866b
Nov 01, 2023 12:41:39 PM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka startTimeMs: 1698842499565
Gradle Test Executor 1 finished executing tests.
> Task :sdks:java:io:kafka:integrationTest
org.apache.beam.sdk.io.kafka.KafkaIOIT > testKafkaIOReadsAndWritesCorrectlyInStreaming FAILED
java.lang.RuntimeException: org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
at org.apache.beam.runners.dataflow.ReadTranslator.translateReadHelper(ReadTranslator.java:55)
at org.apache.beam.runners.dataflow.DataflowRunner$StreamingUnboundedRead$ReadWithIdsTranslator.translate(DataflowRunner.java:2166)
at org.apache.beam.runners.dataflow.DataflowRunner$StreamingUnboundedRead$ReadWithIdsTranslator.translate(DataflowRunner.java:2161)
at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.visitPrimitiveTransform(DataflowPipelineTranslator.java:498)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:593)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:466)
at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.translate(DataflowPipelineTranslator.java:431)
at org.apache.beam.runners.dataflow.DataflowPipelineTranslator.translate(DataflowPipelineTranslator.java:188)
at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1227)
at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:198)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:321)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:398)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
at org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOReadsAndWritesCorrectlyInStreaming(KafkaIOIT.java:216)
Caused by:
org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
1 test completed, 1 failed
Finished generating test XML results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
> Task :sdks:java:io:kafka:integrationTest FAILED
Resolve mutations for :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution worker,5,main]) started.
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution worker,5,main]) started.
> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Custom actions are attached to task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
Caching disabled for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' because:
Gradle would require more information to cache this task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' is not up-to-date because:
Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi --force us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231101123255
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231101123255
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9552479f6697a1d181ef5713cade41523d9e6e9099e175d1fa7abe96ca56beda
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images untag us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231101123255
Successfully started process 'command 'gcloud''
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231101123255]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9552479f6697a1d181ef5713cade41523d9e6e9099e175d1fa7abe96ca56beda]
Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231101123255] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9552479f6697a1d181ef5713cade41523d9e6e9099e175d1fa7abe96ca56beda])].
Starting process 'command './scripts/cleanup_untagged_gcr_images.sh''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: ./scripts/cleanup_untagged_gcr_images.sh us.gcr.io/apache-beam-testing/java-postcommit-it/java
Successfully started process 'command './scripts/cleanup_untagged_gcr_images.sh''
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9552479f6697a1d181ef5713cade41523d9e6e9099e175d1fa7abe96ca56beda
Resolve mutations for :sdks:java:io:kafka:cleanUp (Thread[Execution worker,5,main]) started.
:sdks:java:io:kafka:cleanUp (Thread[Execution worker,5,main]) started.
> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>
Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
For more on this, please refer to https://docs.gradle.org/8.4/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.
BUILD FAILED in 10m 6s
162 actionable tasks: 103 executed, 57 from cache, 2 up-to-date
Publishing build scan...
https://ge.apache.org/s/vk4f4l3jru7iu
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #4155
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/4155/display/redirect>
Changes:
------------------------------------------
[...truncated 533.87 KB...]
Nov 01, 2023 12:39:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2023-11-01T00:39:28.416Z: org.apache.beam.sdk.util.UserCodeException: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)
at org.apache.beam.sdk.io.kafka.KafkaWriter$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
at org.apache.beam.sdk.transforms.MapElements$2.processElement(MapElements.java:151)
at org.apache.beam.sdk.transforms.MapElements$2$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)
at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:321)
at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)
at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)
at org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)
at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)
at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)
at org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
at org.apache.beam.sdk.io.kafka.KafkaWriter.checkForFailures(KafkaWriter.java:149)
at org.apache.beam.sdk.io.kafka.KafkaWriter.processElement(KafkaWriter.java:62)
Caused by: org.apache.kafka.common.errors.TimeoutException: Topic beam-sdf not present in metadata after 60000 ms.
Nov 01, 2023 12:39:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-11-01T00:39:28.496Z: Finished operation Generate-records-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing+Measure write time/ParMultiDo(TimeMonitor)+Write to Kafka/Kafka ProducerRecord/Map/ParMultiDo(Anonymous)+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)/ParMultiDo(KafkaWriter)
Nov 01, 2023 12:39:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-11-01T00:39:28.606Z: Cleaning up.
Nov 01, 2023 12:39:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-11-01T00:39:28.661Z: Stopping worker pool...
Nov 01, 2023 12:41:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-11-01T00:41:32.362Z: Autoscaling: Resized worker pool from 5 to 0.
Nov 01, 2023 12:41:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-11-01T00:41:32.390Z: Worker pool stopped.
Nov 01, 2023 12:41:38 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2023-10-31_17_34_18-2442913136585068574 finished with status DONE.
Nov 01, 2023 12:41:38 AM org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory tryCreateDefaultBucket
INFO: No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
Nov 01, 2023 12:41:38 AM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
Nov 01, 2023 12:41:38 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 01, 2023 12:41:39 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 287 files. Enable logging at DEBUG level to see which files will be staged.
Nov 01, 2023 12:41:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 01, 2023 12:41:42 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 287 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 01, 2023 12:41:43 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 287 files cached, 0 files newly uploaded in 0 seconds
Nov 01, 2023 12:41:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://dataflow-staging-us-central1-844138762903/temp/staging/
Nov 01, 2023 12:41:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <193294 bytes, hash 9f0d186c3d2d75562ae64187fac854e06e6cd865281e276d190a77e3408194cf> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-nw0YbD0tdVYq5kGH-shU4G5s2GUoHidtGQp340CBlM8.pb
Nov 01, 2023 12:41:43 AM org.apache.beam.sdk.coders.SerializableCoder checkEqualsMethodDefined
WARNING: Can't verify serialized elements of type BoundedSource have well defined equals method. This may produce incorrect results on some PipelineRunner implementations
Nov 01, 2023 12:41:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/Read(KafkaUnboundedSource)/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 01, 2023 12:41:46 AM org.apache.kafka.common.config.AbstractConfig logAll
INFO: ConsumerConfig values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers = [35.222.12.167:32401, 35.226.244.42:32402, 34.41.97.205:32403]
check.crcs = true
client.dns.lookup = default
client.id =
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = null
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
isolation.level = read_uncommitted
key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
receive.buffer.bytes = 524288
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = https
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
Nov 01, 2023 12:41:47 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka version: 2.4.1
Nov 01, 2023 12:41:47 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka commitId: c57222ae8cd7866b
Nov 01, 2023 12:41:47 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka startTimeMs: 1698799307015
Gradle Test Executor 1 finished executing tests.
> Task :sdks:java:io:kafka:integrationTest
org.apache.beam.sdk.io.kafka.KafkaIOIT > testKafkaIOReadsAndWritesCorrectlyInStreaming FAILED
java.lang.RuntimeException: org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
at org.apache.beam.runners.dataflow.ReadTranslator.translateReadHelper(ReadTranslator.java:55)
at org.apache.beam.runners.dataflow.DataflowRunner$StreamingUnboundedRead$ReadWithIdsTranslator.translate(DataflowRunner.java:2166)
at org.apache.beam.runners.dataflow.DataflowRunner$StreamingUnboundedRead$ReadWithIdsTranslator.translate(DataflowRunner.java:2161)
at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.visitPrimitiveTransform(DataflowPipelineTranslator.java:498)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:593)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:466)
at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.translate(DataflowPipelineTranslator.java:431)
at org.apache.beam.runners.dataflow.DataflowPipelineTranslator.translate(DataflowPipelineTranslator.java:188)
at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1227)
at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:198)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:321)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:398)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
at org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOReadsAndWritesCorrectlyInStreaming(KafkaIOIT.java:216)
Caused by:
org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
1 test completed, 1 failed
Finished generating test XML results (0.043 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.048 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
> Task :sdks:java:io:kafka:integrationTest FAILED
Resolve mutations for :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution worker Thread 7,5,main]) started.
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution worker Thread 7,5,main]) started.
> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Custom actions are attached to task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
Caching disabled for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' because:
Gradle would require more information to cache this task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' is not up-to-date because:
Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi --force us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231101003312
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231101003312
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:55785e7ca2120f8c0c6dae1f43d27ef85c5dfa888c97d52832e603122c6c1f30
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images untag us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231101003312
Successfully started process 'command 'gcloud''
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231101003312]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:55785e7ca2120f8c0c6dae1f43d27ef85c5dfa888c97d52832e603122c6c1f30]
Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231101003312] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:55785e7ca2120f8c0c6dae1f43d27ef85c5dfa888c97d52832e603122c6c1f30])].
Starting process 'command './scripts/cleanup_untagged_gcr_images.sh''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: ./scripts/cleanup_untagged_gcr_images.sh us.gcr.io/apache-beam-testing/java-postcommit-it/java
Successfully started process 'command './scripts/cleanup_untagged_gcr_images.sh''
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:55785e7ca2120f8c0c6dae1f43d27ef85c5dfa888c97d52832e603122c6c1f30
Resolve mutations for :sdks:java:io:kafka:cleanUp (Thread[Execution worker Thread 7,5,main]) started.
:sdks:java:io:kafka:cleanUp (Thread[included builds,5,main]) started.
> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>
Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
For more on this, please refer to https://docs.gradle.org/8.4/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.
BUILD FAILED in 9m 58s
162 actionable tasks: 103 executed, 57 from cache, 2 up-to-date
Publishing build scan...
https://ge.apache.org/s/zechrupapqrwo
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #4154
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/4154/display/redirect>
Changes:
------------------------------------------
[...truncated 568.90 KB...]
Oct 31, 2023 12:38:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2023-10-31T12:38:20.306Z: org.apache.beam.sdk.util.UserCodeException: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)
at org.apache.beam.sdk.io.kafka.KafkaWriter$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
at org.apache.beam.sdk.transforms.MapElements$2.processElement(MapElements.java:151)
at org.apache.beam.sdk.transforms.MapElements$2$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)
at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:321)
at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)
at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)
at org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)
at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)
at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)
at org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
at org.apache.beam.sdk.io.kafka.KafkaWriter.checkForFailures(KafkaWriter.java:149)
at org.apache.beam.sdk.io.kafka.KafkaWriter.processElement(KafkaWriter.java:62)
Caused by: org.apache.kafka.common.errors.TimeoutException: Topic beam-sdf not present in metadata after 60000 ms.
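The root cause above ("Topic beam-sdf not present in metadata after 60000 ms") is what the Kafka producer raises when the target topic is still unknown to the brokers once max.block.ms (60000 ms by default) expires. A minimal sketch, assuming a hypothetical broker address and illustrative partition/replication settings, of pre-checking and creating the topic with Kafka's AdminClient before a write stage like this one runs:

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class EnsureTopicExists {
  public static void main(String[] args) throws Exception {
    Properties props = new Properties();
    // Hypothetical broker address; the test resolves its brokers from pipeline options.
    props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker-0.example.com:32401");
    try (AdminClient admin = AdminClient.create(props)) {
      // Ask the brokers whether the topic used by this test is already known.
      boolean exists = admin.listTopics().names().get().contains("beam-sdf");
      if (!exists) {
        // Assumed partition/replication values; creating the topic up front means the
        // producer's metadata fetch no longer times out waiting for it to appear.
        admin.createTopics(Collections.singletonList(new NewTopic("beam-sdf", 3, (short) 3)))
            .all()
            .get();
      }
    }
  }
}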
Oct 31, 2023 12:38:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-31T12:38:20.402Z: Finished operation Generate-records-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing+Measure write time/ParMultiDo(TimeMonitor)+Write to Kafka/Kafka ProducerRecord/Map/ParMultiDo(Anonymous)+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)/ParMultiDo(KafkaWriter)
Oct 31, 2023 12:38:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-31T12:38:20.515Z: Cleaning up.
Oct 31, 2023 12:38:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-31T12:38:20.578Z: Stopping **** pool...
Oct 31, 2023 12:40:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-31T12:40:42.499Z: Autoscaling: Resized **** pool from 5 to 0.
Oct 31, 2023 12:40:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-31T12:40:42.534Z: Worker pool stopped.
Oct 31, 2023 12:40:48 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2023-10-31_05_33_06-1896977631127096564 finished with status DONE.
Oct 31, 2023 12:40:48 PM org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory tryCreateDefaultBucket
INFO: No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
Oct 31, 2023 12:40:48 PM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
Oct 31, 2023 12:40:49 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
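The 409 above appears to come from the launcher trying to create the default staging bucket (which already exists) because neither tempLocation nor stagingLocation was supplied. A minimal sketch, assuming a hypothetical pre-existing bucket, of setting both explicitly on DataflowPipelineOptions so the run does not fall back to the default bucket:

import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class ExplicitStagingLocations {
  public static void main(String[] args) {
    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
    // Hypothetical, pre-created bucket; supplying both paths skips the default-bucket
    // lookup (and the 409 retry noise when that bucket already exists).
    options.setTempLocation("gs://my-beam-temp-bucket/temp");
    options.setStagingLocation("gs://my-beam-temp-bucket/staging");
  }
}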
Oct 31, 2023 12:40:49 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 287 files. Enable logging at DEBUG level to see which files will be staged.
Oct 31, 2023 12:40:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 31, 2023 12:40:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 287 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 31, 2023 12:40:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 287 files cached, 0 files newly uploaded in 0 seconds
Oct 31, 2023 12:40:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://dataflow-staging-us-central1-844138762903/temp/staging/
Oct 31, 2023 12:40:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <193266 bytes, hash a3b04ef0301753422e3f4f05d55fe0420722fba0912b8cdfcbf5dd440fe2082a> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-o7BO8DAXU0IuP08F1V_gQgci-6CRK4zfy_XdRA_iCCo.pb
Oct 31, 2023 12:40:53 PM org.apache.beam.sdk.coders.SerializableCoder checkEqualsMethodDefined
WARNING: Can't verify serialized elements of type BoundedSource have well defined equals method. This may produce incorrect results on some PipelineRunner implementations
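The SerializableCoder warning above is emitted whenever the coder cannot confirm that the element type overrides equals. A purely illustrative sketch (not one of this test's classes) of a Serializable element type with equals/hashCode defined, which is the property that check looks for:

import java.io.Serializable;
import java.util.Objects;

public class RecordKey implements Serializable {
  private final String id;

  public RecordKey(String id) {
    this.id = id;
  }

  // Defining equals/hashCode lets coder round-trip checks compare decoded
  // elements to the originals on runners that verify coder consistency.
  @Override
  public boolean equals(Object o) {
    return o instanceof RecordKey && Objects.equals(id, ((RecordKey) o).id);
  }

  @Override
  public int hashCode() {
    return Objects.hash(id);
  }
}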
Oct 31, 2023 12:40:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/Read(KafkaUnboundedSource)/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 31, 2023 12:40:56 PM org.apache.kafka.common.config.AbstractConfig logAll
INFO: ConsumerConfig values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers = [35.192.162.168:32401, 34.132.39.10:32402, 34.31.171.198:32403]
check.crcs = true
client.dns.lookup = default
client.id =
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = null
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
isolation.level = read_uncommitted
key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
receive.buffer.bytes = 524288
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = https
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
Oct 31, 2023 12:40:56 PM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka version: 2.4.1
Oct 31, 2023 12:40:56 PM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka commitId: c57222ae8cd7866b
Oct 31, 2023 12:40:56 PM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka startTimeMs: 1698756056497
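For orientation, the ConsumerConfig dump above is what KafkaIO logs when it builds the consumer for the unbounded read. A minimal sketch, assuming hypothetical broker addresses (the real ones come from the test's pipeline options) and the topic name seen elsewhere in this log, of a KafkaIO.Read that yields a configuration of this shape, with ByteArrayDeserializer on both sides and overrides passed through withConsumerConfigUpdates:

import java.util.HashMap;
import java.util.Map;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;

public class KafkaReadSketch {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Consumer overrides matching values visible in the logged config.
    Map<String, Object> consumerUpdates = new HashMap<>();
    consumerUpdates.put("auto.offset.reset", "earliest");

    PCollection<KV<byte[], byte[]>> records =
        pipeline.apply(
            "Read from unbounded Kafka",
            KafkaIO.<byte[], byte[]>read()
                // Hypothetical broker list standing in for the cluster addresses in the log.
                .withBootstrapServers("broker-0.example.com:32401,broker-1.example.com:32402")
                .withTopic("beam-sdf")
                .withKeyDeserializer(ByteArrayDeserializer.class)
                .withValueDeserializer(ByteArrayDeserializer.class)
                .withConsumerConfigUpdates(consumerUpdates)
                .withoutMetadata());

    pipeline.run().waitUntilFinish();
  }
}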
Gradle Test Executor 1 finished executing tests.
> Task :sdks:java:io:kafka:integrationTest
org.apache.beam.sdk.io.kafka.KafkaIOIT > testKafkaIOReadsAndWritesCorrectlyInStreaming FAILED
java.lang.RuntimeException: org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
at org.apache.beam.runners.dataflow.ReadTranslator.translateReadHelper(ReadTranslator.java:55)
at org.apache.beam.runners.dataflow.DataflowRunner$StreamingUnboundedRead$ReadWithIdsTranslator.translate(DataflowRunner.java:2166)
at org.apache.beam.runners.dataflow.DataflowRunner$StreamingUnboundedRead$ReadWithIdsTranslator.translate(DataflowRunner.java:2161)
at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.visitPrimitiveTransform(DataflowPipelineTranslator.java:498)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:593)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:466)
at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.translate(DataflowPipelineTranslator.java:431)
at org.apache.beam.runners.dataflow.DataflowPipelineTranslator.translate(DataflowPipelineTranslator.java:188)
at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1227)
at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:198)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:321)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:398)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
at org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOReadsAndWritesCorrectlyInStreaming(KafkaIOIT.java:216)
Caused by:
org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
1 test completed, 1 failed
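Unlike the producer-side failure, this one surfaces while the pipeline graph is still being translated on the launching machine: per the stack trace, the translation of the unbounded Kafka read consults topic metadata there, and it ends in the TimeoutException the Kafka consumer raises when that metadata cannot be fetched. A minimal sketch, with a hypothetical broker address, that reproduces the "Timeout expired while fetching topic metadata" condition directly with a KafkaConsumer:

import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;

public class ProbeTopicMetadata {
  public static void main(String[] args) {
    Properties props = new Properties();
    // Hypothetical broker address; an unreachable broker (or a missing topic) makes
    // partitionsFor() fail with "Timeout expired while fetching topic metadata".
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker-0.example.com:32401");
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());
    // Shorten the 60000 ms default.api.timeout.ms seen in the config dump for a quicker probe.
    props.put(ConsumerConfig.DEFAULT_API_TIMEOUT_MS_CONFIG, "15000");
    try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
      System.out.println(consumer.partitionsFor("beam-sdf"));
    }
  }
}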
Finished generating test XML results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.037 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
> Task :sdks:java:io:kafka:integrationTest FAILED
Resolve mutations for :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution **** Thread 2,5,main]) started.
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution **** Thread 5,5,main]) started.
> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Custom actions are attached to task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
Caching disabled for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' because:
Gradle would require more information to cache this task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' is not up-to-date because:
Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi --force us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231031123205
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231031123205
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6897c54f4cf0f9a57e93183f5b45d75c68f140bf54b2799fdfb54f05a7ee799f
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images untag us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231031123205
Successfully started process 'command 'gcloud''
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231031123205]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6897c54f4cf0f9a57e93183f5b45d75c68f140bf54b2799fdfb54f05a7ee799f]
Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231031123205] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6897c54f4cf0f9a57e93183f5b45d75c68f140bf54b2799fdfb54f05a7ee799f])].
Starting process 'command './scripts/cleanup_untagged_gcr_images.sh''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: ./scripts/cleanup_untagged_gcr_images.sh us.gcr.io/apache-beam-testing/java-postcommit-it/java
Successfully started process 'command './scripts/cleanup_untagged_gcr_images.sh''
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:6897c54f4cf0f9a57e93183f5b45d75c68f140bf54b2799fdfb54f05a7ee799f
Resolve mutations for :sdks:java:io:kafka:cleanUp (Thread[Execution **** Thread 5,5,main]) started.
:sdks:java:io:kafka:cleanUp (Thread[Execution **** Thread 5,5,main]) started.
> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>
Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
For more on this, please refer to https://docs.gradle.org/8.4/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.
BUILD FAILED in 10m 14s
162 actionable tasks: 103 executed, 57 from cache, 2 up-to-date
Publishing build scan...
https://ge.apache.org/s/ldlfew2z3v5s2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #4153
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/4153/display/redirect?page=changes>
Changes:
[noreply] [YAML] Add a basic aggregating transform to Beam Yaml. (#29167)
------------------------------------------
[...truncated 562.38 KB...]
Oct 31, 2023 12:39:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2023-10-31T00:39:09.787Z: org.apache.beam.sdk.util.UserCodeException: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)
at org.apache.beam.sdk.io.kafka.KafkaWriter$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
at org.apache.beam.sdk.transforms.MapElements$2.processElement(MapElements.java:151)
at org.apache.beam.sdk.transforms.MapElements$2$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)
at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:321)
at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)
at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)
at org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)
at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)
at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)
at org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
at org.apache.beam.sdk.io.kafka.KafkaWriter.checkForFailures(KafkaWriter.java:149)
at org.apache.beam.sdk.io.kafka.KafkaWriter.processElement(KafkaWriter.java:62)
Caused by: org.apache.kafka.common.errors.TimeoutException: Topic beam-sdf not present in metadata after 60000 ms.
Oct 31, 2023 12:39:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-31T00:39:09.870Z: Finished operation Generate-records-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing+Measure write time/ParMultiDo(TimeMonitor)+Write to Kafka/Kafka ProducerRecord/Map/ParMultiDo(Anonymous)+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)/ParMultiDo(KafkaWriter)
Oct 31, 2023 12:39:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-31T00:39:10.002Z: Cleaning up.
Oct 31, 2023 12:39:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-31T00:39:10.067Z: Stopping **** pool...
Oct 31, 2023 12:41:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-31T00:41:34.936Z: Autoscaling: Resized **** pool from 5 to 0.
Oct 31, 2023 12:41:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-31T00:41:34.980Z: Worker pool stopped.
Oct 31, 2023 12:41:41 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2023-10-30_17_33_35-12410722604861764655 finished with status DONE.
Oct 31, 2023 12:41:41 AM org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory tryCreateDefaultBucket
INFO: No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
Oct 31, 2023 12:41:41 AM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
Oct 31, 2023 12:41:41 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 31, 2023 12:41:42 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 287 files. Enable logging at DEBUG level to see which files will be staged.
Oct 31, 2023 12:41:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 31, 2023 12:41:45 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 287 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 31, 2023 12:41:45 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 287 files cached, 0 files newly uploaded in 0 seconds
Oct 31, 2023 12:41:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://dataflow-staging-us-central1-844138762903/temp/staging/
Oct 31, 2023 12:41:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <193291 bytes, hash 12745eecd267677e857511695f85a6d07cfe9dd53e8a493ad35fb9897353b4b4> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-EnRe7NJnZ36FdRFpX4Wm0Hz-ndU-ikk601-5iXNTtLQ.pb
Oct 31, 2023 12:41:46 AM org.apache.beam.sdk.coders.SerializableCoder checkEqualsMethodDefined
WARNING: Can't verify serialized elements of type BoundedSource have well defined equals method. This may produce incorrect results on some PipelineRunner implementations
Oct 31, 2023 12:41:49 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/Read(KafkaUnboundedSource)/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 31, 2023 12:41:49 AM org.apache.kafka.common.config.AbstractConfig logAll
INFO: ConsumerConfig values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers = [34.27.199.151:32401, 35.224.224.41:32402, 35.194.32.78:32403]
check.crcs = true
client.dns.lookup = default
client.id =
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = null
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
isolation.level = read_uncommitted
key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
receive.buffer.bytes = 524288
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = https
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
Oct 31, 2023 12:41:49 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka version: 2.4.1
Oct 31, 2023 12:41:49 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka commitId: c57222ae8cd7866b
Oct 31, 2023 12:41:49 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka startTimeMs: 1698712909348
Gradle Test Executor 1 finished executing tests.
> Task :sdks:java:io:kafka:integrationTest
org.apache.beam.sdk.io.kafka.KafkaIOIT > testKafkaIOReadsAndWritesCorrectlyInStreaming FAILED
java.lang.RuntimeException: org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
at org.apache.beam.runners.dataflow.ReadTranslator.translateReadHelper(ReadTranslator.java:55)
at org.apache.beam.runners.dataflow.DataflowRunner$StreamingUnboundedRead$ReadWithIdsTranslator.translate(DataflowRunner.java:2166)
at org.apache.beam.runners.dataflow.DataflowRunner$StreamingUnboundedRead$ReadWithIdsTranslator.translate(DataflowRunner.java:2161)
at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.visitPrimitiveTransform(DataflowPipelineTranslator.java:498)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:593)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:466)
at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.translate(DataflowPipelineTranslator.java:431)
at org.apache.beam.runners.dataflow.DataflowPipelineTranslator.translate(DataflowPipelineTranslator.java:188)
at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1227)
at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:198)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:321)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:398)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
at org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOReadsAndWritesCorrectlyInStreaming(KafkaIOIT.java:216)
Caused by:
org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
1 test completed, 1 failed
Finished generating test XML results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.037 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
> Task :sdks:java:io:kafka:integrationTest FAILED
Resolve mutations for :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[included builds,5,main]) started.
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution **** Thread 3,5,main]) started.
> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Custom actions are attached to task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
Caching disabled for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' because:
Gradle would require more information to cache this task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' is not up-to-date because:
Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi --force us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231031003224
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231031003224
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0ca393bf7b0e7e4ff2067ce610907ae3bc3042e735440072387a889fb105f4d4
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images untag us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231031003224
Successfully started process 'command 'gcloud''
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231031003224]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0ca393bf7b0e7e4ff2067ce610907ae3bc3042e735440072387a889fb105f4d4]
Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231031003224] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0ca393bf7b0e7e4ff2067ce610907ae3bc3042e735440072387a889fb105f4d4])].
Starting process 'command './scripts/cleanup_untagged_gcr_images.sh''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: ./scripts/cleanup_untagged_gcr_images.sh us.gcr.io/apache-beam-testing/java-postcommit-it/java
Successfully started process 'command './scripts/cleanup_untagged_gcr_images.sh''
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0ca393bf7b0e7e4ff2067ce610907ae3bc3042e735440072387a889fb105f4d4
Resolve mutations for :sdks:java:io:kafka:cleanUp (Thread[Execution **** Thread 3,5,main]) started.
:sdks:java:io:kafka:cleanUp (Thread[Execution **** Thread 3,5,main]) started.
> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>
Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
For more on this, please refer to https://docs.gradle.org/8.4/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.
BUILD FAILED in 10m 51s
162 actionable tasks: 103 executed, 57 from cache, 2 up-to-date
Publishing build scan...
https://ge.apache.org/s/m4tg2fosxzsjo
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_PerformanceTests_Kafka_IO #4152
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/4152/display/redirect>
Changes:
------------------------------------------
[...truncated 628.27 KB...]
at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)
at org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)
at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)
at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)
at org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
at org.apache.beam.sdk.io.kafka.KafkaWriter.checkForFailures(KafkaWriter.java:149)
at org.apache.beam.sdk.io.kafka.KafkaWriter.processElement(KafkaWriter.java:62)
Caused by: org.apache.kafka.common.errors.TimeoutException: Topic beam-sdf not present in metadata after 60000 ms.
Worker ID: kafkaioit0testkafkaioread-10300533-p91i-harness-55rs,
Root cause: org.apache.beam.sdk.util.UserCodeException: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)
at org.apache.beam.sdk.io.kafka.KafkaWriter$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
at org.apache.beam.sdk.transforms.MapElements$2.processElement(MapElements.java:151)
at org.apache.beam.sdk.transforms.MapElements$2$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)
at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:321)
at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)
at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)
at org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)
at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)
at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)
at org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
at org.apache.beam.sdk.io.kafka.KafkaWriter.checkForFailures(KafkaWriter.java:149)
at org.apache.beam.sdk.io.kafka.KafkaWriter.processElement(KafkaWriter.java:62)
Caused by: org.apache.kafka.common.errors.TimeoutException: Topic beam-sdf not present in metadata after 60000 ms.
Worker ID: kafkaioit0testkafkaioread-10300533-p91i-harness-55rs,
Root cause: org.apache.beam.sdk.util.UserCodeException: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)
at org.apache.beam.sdk.io.kafka.KafkaWriter$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
at org.apache.beam.sdk.transforms.MapElements$2.processElement(MapElements.java:151)
at org.apache.beam.sdk.transforms.MapElements$2$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)
at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:321)
at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)
at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)
at org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)
at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)
at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)
at org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
at org.apache.beam.sdk.io.kafka.KafkaWriter.checkForFailures(KafkaWriter.java:149)
at org.apache.beam.sdk.io.kafka.KafkaWriter.processElement(KafkaWriter.java:62)
Caused by: org.apache.kafka.common.errors.TimeoutException: Topic beam-sdf not present in metadata after 60000 ms.
Worker ID: kafkaioit0testkafkaioread-10300533-p91i-harness-55rs,
Root cause: org.apache.beam.sdk.util.UserCodeException: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)
at org.apache.beam.sdk.io.kafka.KafkaWriter$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
at org.apache.beam.sdk.transforms.MapElements$2.processElement(MapElements.java:151)
at org.apache.beam.sdk.transforms.MapElements$2$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)
at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:321)
at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)
at org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)
at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)
at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
at org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)
at org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)
at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)
at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)
at org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
at org.apache.beam.sdk.io.kafka.KafkaWriter.checkForFailures(KafkaWriter.java:149)
at org.apache.beam.sdk.io.kafka.KafkaWriter.processElement(KafkaWriter.java:62)
Caused by: org.apache.kafka.common.errors.TimeoutException: Topic beam-sdf not present in metadata after 60000 ms.
Worker ID: kafkaioit0testkafkaioread-10300533-p91i-harness-55rs
Oct 30, 2023 12:39:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-30T12:39:41.895Z: Cleaning up.
Oct 30, 2023 12:39:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-30T12:39:41.956Z: Stopping **** pool...
Oct 30, 2023 12:42:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-30T12:42:03.672Z: Autoscaling: Resized **** pool from 5 to 0.
Oct 30, 2023 12:42:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-30T12:42:03.715Z: Worker pool stopped.
Oct 30, 2023 12:42:09 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2023-10-30_05_33_11-11709220368883115438 failed with status FAILED.
Gradle Test Executor 1 finished executing tests.
> Task :sdks:java:io:kafka:integrationTest FAILED
org.apache.beam.sdk.io.kafka.KafkaIOIT > testKafkaIOReadsAndWritesCorrectlyInStreaming FAILED
java.lang.AssertionError: Values should be different. Actual: FAILED
at org.junit.Assert.fail(Assert.java:89)
at org.junit.Assert.failEquals(Assert.java:187)
at org.junit.Assert.assertNotEquals(Assert.java:163)
at org.junit.Assert.assertNotEquals(Assert.java:177)
at org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOReadsAndWritesCorrectlyInStreaming(KafkaIOIT.java:212)
1 test completed, 1 failed
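The assertion that failed at KafkaIOIT.java:212 is an assertNotEquals on the terminal pipeline state, and the default JUnit message above ("Values should be different. Actual: FAILED") shows the streaming job resolved to FAILED, consistent with the KafkaWriter error earlier in the log. A minimal sketch of that kind of check, with illustrative names rather than the actual KafkaIOIT code:

// Hedged sketch of a terminal-state assertion for a streaming Beam pipeline.
import static org.junit.Assert.assertNotEquals;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.PipelineResult;
import org.joda.time.Duration;

public class StreamingStateCheck {
  // Runs the pipeline, waits a bounded time, and asserts the terminal state is not FAILED.
  // Without a custom message, assertNotEquals prints "Values should be different. Actual: FAILED".
  static void assertPipelineDidNotFail(Pipeline pipeline) {
    PipelineResult result = pipeline.run();
    result.waitUntilFinish(Duration.standardMinutes(20)); // wait bound is an assumed value
    assertNotEquals(PipelineResult.State.FAILED, result.getState());
  }
}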
Finished generating test XML results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.038 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
Resolve mutations for :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution worker Thread 4,5,main]) started.
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution worker Thread 4,5,main]) started.
> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Custom actions are attached to task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
Caching disabled for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' because:
Gradle would require more information to cache this task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' is not up-to-date because:
Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi --force us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231030123207
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231030123207
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d7c6fbbb07b265129d268ccb0249bf234b87e6183ed834011949d1f2fdc1c642
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images untag us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231030123207
Successfully started process 'command 'gcloud''
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231030123207]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d7c6fbbb07b265129d268ccb0249bf234b87e6183ed834011949d1f2fdc1c642]
Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231030123207] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d7c6fbbb07b265129d268ccb0249bf234b87e6183ed834011949d1f2fdc1c642])].
Starting process 'command './scripts/cleanup_untagged_gcr_images.sh''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: ./scripts/cleanup_untagged_gcr_images.sh us.gcr.io/apache-beam-testing/java-postcommit-it/java
Successfully started process 'command './scripts/cleanup_untagged_gcr_images.sh''
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d7c6fbbb07b265129d268ccb0249bf234b87e6183ed834011949d1f2fdc1c642
Resolve mutations for :sdks:java:io:kafka:cleanUp (Thread[Execution worker Thread 4,5,main]) started.
:sdks:java:io:kafka:cleanUp (Thread[Execution worker Thread 5,5,main]) started.
> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>
Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
For more on this, please refer to https://docs.gradle.org/8.4/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.
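As the Gradle notice above suggests, re-running the failing task with individual deprecation warnings enabled, e.g. ./gradlew :sdks:java:io:kafka:integrationTest --warning-mode all (sketched here assuming execution from the repository root), would show which scripts or plugins rely on features scheduled for removal in Gradle 9.0.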
BUILD FAILED in 10m 23s
162 actionable tasks: 103 executed, 57 from cache, 2 up-to-date
Publishing build scan...
https://ge.apache.org/s/su5ntvmv7wnwk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org