Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/07/27 14:22:41 UTC

Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java17 #218

See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java17/218/display/redirect?page=changes>

Changes:

[chamikaramj] Adds KV support for the Java RunInference transform.

[noreply] Replace distutils with supported modules. (#21968)

[noreply] Revert "Replace distutils with supported modules. " (#22453)

[noreply] Enable configuration to avoid successfully written Table Row propagation

[noreply] lint fixes for recent import (#22455)

[noreply] Bump Python Combine LoadTests timeout to 12 hours (#22439)

[noreply] convert windmill min timestamp to beam min timestamp (#21915)

[noreply] [CdapIO] Fixed necessary warnings (#22399)

[noreply] [#22051]: Add read_time support to Google Cloud Datastore connector


------------------------------------------
[...truncated 116.51 KB...]
60651846df32: Preparing
c6bad0095ade: Preparing
a89bad50fa10: Preparing
4be4bd4b2289: Preparing
758205d490e6: Preparing
3c5bcdabe3be: Preparing
a3ec96bccb8e: Preparing
ad56b0db5473: Preparing
71c84bc3e75f: Preparing
c65f819e2d1a: Preparing
25ecc7795fd2: Preparing
2150fe6876b2: Preparing
8d014c4e2d78: Preparing
36e825a8fd75: Preparing
b46f6a0f7ae5: Preparing
3bc383470c05: Preparing
e93827457889: Preparing
08fa02ce37eb: Preparing
a037458de4e0: Preparing
bafdbe68e4ae: Preparing
a13c519c6361: Preparing
a3ec96bccb8e: Waiting
b46f6a0f7ae5: Waiting
3c5bcdabe3be: Waiting
a037458de4e0: Waiting
36e825a8fd75: Waiting
2150fe6876b2: Waiting
ad56b0db5473: Waiting
bafdbe68e4ae: Waiting
8d014c4e2d78: Waiting
3bc383470c05: Waiting
71c84bc3e75f: Waiting
e93827457889: Waiting
25ecc7795fd2: Waiting
a13c519c6361: Waiting
c6bad0095ade: Pushed
758205d490e6: Pushed
a89bad50fa10: Pushed
4be4bd4b2289: Pushed
60651846df32: Pushed
ad56b0db5473: Pushed
a3ec96bccb8e: Pushed
c65f819e2d1a: Pushed
3c5bcdabe3be: Pushed
25ecc7795fd2: Pushed
71c84bc3e75f: Pushed
2150fe6876b2: Pushed
3bc383470c05: Layer already exists
e93827457889: Layer already exists
a037458de4e0: Layer already exists
bafdbe68e4ae: Layer already exists
08fa02ce37eb: Layer already exists
a13c519c6361: Layer already exists
36e825a8fd75: Pushed
b46f6a0f7ae5: Pushed
8d014c4e2d78: Pushed
20220727132020: digest: sha256:f1fa4e92bfdc4095fdfd8f85306ca1877473b563a74cae8e0292f9131bed8ab1 size: 4729

> Task :sdks:java:testing:load-tests:run
Jul 27, 2022 1:21:08 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jul 27, 2022 1:21:08 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 222 files. Enable logging at DEBUG level to see which files will be staged.
Jul 27, 2022 1:21:09 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jul 27, 2022 1:21:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jul 27, 2022 1:21:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 222 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jul 27, 2022 1:21:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 222 files cached, 0 files newly uploaded in 1 seconds
Jul 27, 2022 1:21:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jul 27, 2022 1:21:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <120328 bytes, hash 09ce25176c947ddbb511fdf8a3ce7b466ddc5e9fccc2168afb704b84749af952> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Cc4lF2yUfdu1Ef34o857Rm3cXp_MwhaK-3BLhHSa-VI.pb
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jul 27, 2022 1:21:15 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0, endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000, endOffset=2000000}, SyntheticUnboundedSource{startOffset=2000000, endOffset=3000000}, SyntheticUnboundedSource{startOffset=3000000, endOffset=4000000}, SyntheticUnboundedSource{startOffset=4000000, endOffset=5000000}, SyntheticUnboundedSource{startOffset=5000000, endOffset=6000000}, SyntheticUnboundedSource{startOffset=6000000, endOffset=7000000}, SyntheticUnboundedSource{startOffset=7000000, endOffset=8000000}, SyntheticUnboundedSource{startOffset=8000000, endOffset=9000000}, SyntheticUnboundedSource{startOffset=9000000, endOffset=10000000}, SyntheticUnboundedSource{startOffset=10000000, endOffset=11000000}, SyntheticUnboundedSource{startOffset=11000000, endOffset=12000000}, SyntheticUnboundedSource{startOffset=12000000, endOffset=13000000}, SyntheticUnboundedSource{startOffset=13000000, endOffset=14000000}, SyntheticUnboundedSource{startOffset=14000000, endOffset=15000000}, SyntheticUnboundedSource{startOffset=15000000, endOffset=16000000}, SyntheticUnboundedSource{startOffset=16000000, endOffset=17000000}, SyntheticUnboundedSource{startOffset=17000000, endOffset=18000000}, SyntheticUnboundedSource{startOffset=18000000, endOffset=19000000}, SyntheticUnboundedSource{startOffset=19000000, endOffset=20000000}]
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jul 27, 2022 1:21:15 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0, endOffset=100000}, SyntheticUnboundedSource{startOffset=100000, endOffset=200000}, SyntheticUnboundedSource{startOffset=200000, endOffset=300000}, SyntheticUnboundedSource{startOffset=300000, endOffset=400000}, SyntheticUnboundedSource{startOffset=400000, endOffset=500000}, SyntheticUnboundedSource{startOffset=500000, endOffset=600000}, SyntheticUnboundedSource{startOffset=600000, endOffset=700000}, SyntheticUnboundedSource{startOffset=700000, endOffset=800000}, SyntheticUnboundedSource{startOffset=800000, endOffset=900000}, SyntheticUnboundedSource{startOffset=900000, endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000, endOffset=1100000}, SyntheticUnboundedSource{startOffset=1100000, endOffset=1200000}, SyntheticUnboundedSource{startOffset=1200000, endOffset=1300000}, SyntheticUnboundedSource{startOffset=1300000, endOffset=1400000}, SyntheticUnboundedSource{startOffset=1400000, endOffset=1500000}, SyntheticUnboundedSource{startOffset=1500000, endOffset=1600000}, SyntheticUnboundedSource{startOffset=1600000, endOffset=1700000}, SyntheticUnboundedSource{startOffset=1700000, endOffset=1800000}, SyntheticUnboundedSource{startOffset=1800000, endOffset=1900000}, SyntheticUnboundedSource{startOffset=1900000, endOffset=2000000}]
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.41.0-SNAPSHOT
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-27_06_21_15-2684711923519211327?project=apache-beam-testing
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-07-27_06_21_15-2684711923519211327
Jul 27, 2022 1:21:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-07-27_06_21_15-2684711923519211327
Jul 27, 2022 1:21:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-07-27T13:21:21.111Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java170dataflow0v20streaming0cogbk02-jenkins-07-yhxc. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jul 27, 2022 1:21:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:25.559Z: Worker configuration: e2-standard-2 in us-central1-a.
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:26.592Z: Expanding SplittableParDo operations into optimizable parts.
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:26.688Z: Expanding CollectionToSingleton operations into optimizable parts.
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:26.777Z: Expanding CoGroupByKey operations into optimizable parts.
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:26.853Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:26.882Z: Expanding GroupByKey operations into streaming Read/Write steps
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:26.940Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.031Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.057Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.102Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.163Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.189Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.211Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.241Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.270Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.295Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.328Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.364Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.392Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.426Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.461Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.500Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.536Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.572Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.609Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.640Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.692Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.721Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.754Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.786Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:27.942Z: Running job using Streaming Engine
Jul 27, 2022 1:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:28.161Z: Starting 5 workers in us-central1-a...
Jul 27, 2022 1:21:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:21:57.880Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jul 27, 2022 1:22:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:22:05.636Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jul 27, 2022 1:23:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T13:23:08.326Z: Workers have started successfully.
Jul 27, 2022 2:21:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T14:21:47.679Z: Cleaning up.
Jul 27, 2022 2:21:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T14:21:47.807Z: Stopping worker pool...
Jul 27, 2022 2:21:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T14:21:47.863Z: Stopping worker pool...
Jul 27, 2022 2:22:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T14:22:29.362Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jul 27, 2022 2:22:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-27T14:22:29.398Z: Worker pool stopped.
Jul 27, 2022 2:22:36 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-07-27_06_21_15-2684711923519211327 finished with status DONE.
Load test results for test (ID): 27af9eca-9fad-4fd8-b29b-4088a5f274da and timestamp: 2022-07-27T13:21:09.125000000Z:
                 Metric:                    Value:
dataflow_v2_java17_runtime_sec                  3450.793
dataflow_v2_java17_total_bytes_count                2.199996E9

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220727132020
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f1fa4e92bfdc4095fdfd8f85306ca1877473b563a74cae8e0292f9131bed8ab1
ERROR: (gcloud.container.images.untag) Image could not be found: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220727132020]

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java17/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 295

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 1h 2m 20s
105 actionable tasks: 8 executed, 97 up-to-date

Publishing build scan...
https://gradle.com/s/ocwqck3gxzedk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java17 #219

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java17/219/display/redirect?page=changes>

