Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/11/05 13:17:18 UTC

Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java17 #319

See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java17/319/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Move logging to its own module.

[Robert Bradshaw] Cleanup worker logging.

[Robert Bradshaw] Add basic counter setting and getting to the typescript SDK.

[Robert Bradshaw] Support metrics over the portability API.

[Robert Bradshaw] Add distribution metric type.

[Robert Bradshaw] old prettier change

[noreply] TFX image classification example (#23456)

[noreply] Immediately truncate full restriction on drain of periodic impulse

[noreply] [Task]: PR Bot will push commits only if they are non-empty (#23937)

[noreply] Bump cloud.google.com/go/datastore from 1.8.0 to 1.9.0 in /sdks (#23916)

[Robert Bradshaw] Remove obsolete TODO.

[Robert Bradshaw] Only report counters that were actually used.

[noreply] Add custom inference fn suport to the sklearn model handlers (#23642)

[noreply] removed trailing whitespace (#23987)

[noreply] Beam starter projects blog post (#23964)

[noreply] Enable more portable-runner requiring tests. (#23970)

[noreply] Website add and update logos (#23899)


------------------------------------------
[...truncated 52.99 KB...]
a1aaff498b9b: Preparing
3bc383470c05: Preparing
e93827457889: Preparing
08fa02ce37eb: Preparing
a037458de4e0: Preparing
bafdbe68e4ae: Preparing
a13c519c6361: Preparing
118910dd6ef4: Waiting
14bbd92cdf65: Waiting
fa3b7fde4a9a: Waiting
bafdbe68e4ae: Waiting
a13c519c6361: Waiting
08fa02ce37eb: Waiting
04986b633f73: Waiting
5a6c6e2170d0: Waiting
2116c4a56018: Waiting
a037458de4e0: Waiting
e93827457889: Waiting
a1aaff498b9b: Waiting
1b951d4ec0d8: Waiting
3bc383470c05: Waiting
1175c42f627e: Waiting
675b48f1d3c5: Waiting
a3b02487a7f4: Pushed
2ab6d67b49d8: Pushed
cd801e360340: Pushed
cf7f6a1c0f6b: Pushed
ef81e4d1b652: Pushed
1b951d4ec0d8: Pushed
5a6c6e2170d0: Pushed
14bbd92cdf65: Pushed
fa3b7fde4a9a: Pushed
118910dd6ef4: Pushed
675b48f1d3c5: Pushed
3bc383470c05: Layer already exists
e93827457889: Layer already exists
08fa02ce37eb: Layer already exists
a037458de4e0: Layer already exists
bafdbe68e4ae: Layer already exists
a13c519c6361: Layer already exists
2116c4a56018: Pushed
a1aaff498b9b: Pushed
1175c42f627e: Pushed
04986b633f73: Pushed
20221105123045: digest: sha256:1a91fe8f364996ae80b2b6499f3289b4f5ad86045d21edc141d6f8a0f921940d size: 4729

> Task :sdks:java:testing:load-tests:run
Nov 05, 2022 12:32:34 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 05, 2022 12:32:35 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 229 files. Enable logging at DEBUG level to see which files will be staged.
Nov 05, 2022 12:32:36 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Nov 05, 2022 12:32:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 05, 2022 12:32:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 229 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 05, 2022 12:32:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 229 files cached, 0 files newly uploaded in 0 seconds
Nov 05, 2022 12:32:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 05, 2022 12:32:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <122670 bytes, hash 53d9a674979d87a06760a5a45c81b23907992634999a1143fa2306206e2641be> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-U9mmdJedh6BnYKWkXIGyOQeZJjSZmhFD-iMGIG4mQb4.pb
Nov 05, 2022 12:32:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 05, 2022 12:32:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0, endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000, endOffset=2000000}, SyntheticUnboundedSource{startOffset=2000000, endOffset=3000000}, SyntheticUnboundedSource{startOffset=3000000, endOffset=4000000}, SyntheticUnboundedSource{startOffset=4000000, endOffset=5000000}, SyntheticUnboundedSource{startOffset=5000000, endOffset=6000000}, SyntheticUnboundedSource{startOffset=6000000, endOffset=7000000}, SyntheticUnboundedSource{startOffset=7000000, endOffset=8000000}, SyntheticUnboundedSource{startOffset=8000000, endOffset=9000000}, SyntheticUnboundedSource{startOffset=9000000, endOffset=10000000}, SyntheticUnboundedSource{startOffset=10000000, endOffset=11000000}, SyntheticUnboundedSource{startOffset=11000000, endOffset=12000000}, SyntheticUnboundedSource{startOffset=12000000, endOffset=13000000}, SyntheticUnboundedSource{startOffset=13000000, endOffset=14000000}, SyntheticUnboundedSource{startOffset=14000000, endOffset=15000000}, SyntheticUnboundedSource{startOffset=15000000, endOffset=16000000}, SyntheticUnboundedSource{startOffset=16000000, endOffset=17000000}, SyntheticUnboundedSource{startOffset=17000000, endOffset=18000000}, SyntheticUnboundedSource{startOffset=18000000, endOffset=19000000}, SyntheticUnboundedSource{startOffset=19000000, endOffset=20000000}]
Nov 05, 2022 12:32:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Nov 05, 2022 12:32:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Nov 05, 2022 12:32:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Nov 05, 2022 12:32:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Nov 05, 2022 12:32:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0, endOffset=100000}, SyntheticUnboundedSource{startOffset=100000, endOffset=200000}, SyntheticUnboundedSource{startOffset=200000, endOffset=300000}, SyntheticUnboundedSource{startOffset=300000, endOffset=400000}, SyntheticUnboundedSource{startOffset=400000, endOffset=500000}, SyntheticUnboundedSource{startOffset=500000, endOffset=600000}, SyntheticUnboundedSource{startOffset=600000, endOffset=700000}, SyntheticUnboundedSource{startOffset=700000, endOffset=800000}, SyntheticUnboundedSource{startOffset=800000, endOffset=900000}, SyntheticUnboundedSource{startOffset=900000, endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000, endOffset=1100000}, SyntheticUnboundedSource{startOffset=1100000, endOffset=1200000}, SyntheticUnboundedSource{startOffset=1200000, endOffset=1300000}, SyntheticUnboundedSource{startOffset=1300000, endOffset=1400000}, SyntheticUnboundedSource{startOffset=1400000, endOffset=1500000}, SyntheticUnboundedSource{startOffset=1500000, endOffset=1600000}, SyntheticUnboundedSource{startOffset=1600000, endOffset=1700000}, SyntheticUnboundedSource{startOffset=1700000, endOffset=1800000}, SyntheticUnboundedSource{startOffset=1800000, endOffset=1900000}, SyntheticUnboundedSource{startOffset=1900000, endOffset=2000000}]
Nov 05, 2022 12:32:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Nov 05, 2022 12:32:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Nov 05, 2022 12:32:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Nov 05, 2022 12:32:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Nov 05, 2022 12:32:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Nov 05, 2022 12:32:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Nov 05, 2022 12:32:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Nov 05, 2022 12:32:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Nov 05, 2022 12:32:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Nov 05, 2022 12:32:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Nov 05, 2022 12:32:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Nov 05, 2022 12:32:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.44.0-SNAPSHOT
Nov 05, 2022 12:32:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-05_05_32_41-12892909955506723349?project=apache-beam-testing
Nov 05, 2022 12:32:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-11-05_05_32_41-12892909955506723349
Nov 05, 2022 12:32:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-11-05_05_32_41-12892909955506723349
Nov 05, 2022 12:32:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-11-05T12:32:46.925Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java170dataflow0v20streaming0cogbk01-jenkins-11-hvxu. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 05, 2022 12:32:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:54.006Z: Worker configuration: e2-standard-2 in us-central1-b.
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:55.247Z: Expanding SplittableParDo operations into optimizable parts.
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:55.286Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:55.365Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:55.443Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:55.479Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:55.543Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:55.658Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:55.693Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:55.726Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:55.761Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:55.797Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:55.832Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:55.864Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:55.887Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:55.909Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:55.941Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:55.980Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:56.005Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:56.040Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:56.074Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:56.104Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:56.139Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:56.170Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:56.205Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:56.239Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:56.273Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:56.307Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:56.344Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:56.376Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:56.556Z: Running job using Streaming Engine
Nov 05, 2022 12:32:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:32:56.796Z: Starting 5 workers in us-central1-b...
Nov 05, 2022 12:33:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:33:12.658Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 05, 2022 12:35:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:35:09.393Z: Worker configuration: e2-standard-2 in us-central1-b.
Nov 05, 2022 12:35:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:35:13.266Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Nov 05, 2022 12:35:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:35:14.792Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 05, 2022 12:35:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T12:35:33.230Z: Workers have started successfully.
Nov 05, 2022 1:14:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T13:14:32.380Z: Cleaning up.
Nov 05, 2022 1:14:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T13:14:32.512Z: Stopping worker pool...
Nov 05, 2022 1:14:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T13:14:32.574Z: Stopping worker pool...
Nov 05, 2022 1:17:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T13:17:02.490Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Nov 05, 2022 1:17:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-11-05T13:17:02.579Z: Worker pool stopped.
Nov 05, 2022 1:17:10 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-11-05_05_32_41-12892909955506723349 finished with status DONE.
Load test results for test (ID): bd3538df-82fe-4f67-8ff8-de5a264d8c43 and timestamp: 2022-11-05T12:32:35.802000000Z:
                              Metric:        Value:
       dataflow_v2_java17_runtime_sec      2141.831
 dataflow_v2_java17_total_bytes_count    2.199996E9

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221105123045
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1a91fe8f364996ae80b2b6499f3289b4f5ad86045d21edc141d6f8a0f921940d
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221105123045]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1a91fe8f364996ae80b2b6499f3289b4f5ad86045d21edc141d6f8a0f921940d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221105123045] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1a91fe8f364996ae80b2b6499f3289b4f5ad86045d21edc141d6f8a0f921940d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2f3b39277944faadd94fb0ad509b47d4f04ab92b69fbf7c689e828d079076661
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2f3b39277944faadd94fb0ad509b47d4f04ab92b69fbf7c689e828d079076661
ERROR: (gcloud.container.images.delete) Not found: response: {'docker-distribution-api-version': 'registry/2.0', 'content-type': 'application/json', 'content-encoding': 'gzip', 'date': 'Sat, 05 Nov 2022 13:17:15 GMT', 'server': 'Docker Registry', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'transfer-encoding': 'chunked', 'status': 404}
Failed to compute blob liveness for manifest: 'sha256:2f3b39277944faadd94fb0ad509b47d4f04ab92b69fbf7c689e828d079076661': None

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java17/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 304

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org
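
For context, the failing step removes untagged container images from GCR. A minimal sketch of how such a cleanup is commonly done with gcloud (an illustration under that assumption; not necessarily the actual contents of scripts/cleanup_untagged_gcr_images.sh):

  # IMAGE is taken from the log above; list digests with no remaining tags, then delete each one.
  IMAGE=us.gcr.io/apache-beam-testing/java-postcommit-it/java
  gcloud container images list-tags "${IMAGE}" --filter='-tags:*' --format='get(digest)' |
    while read -r digest; do
      gcloud container images delete -q "${IMAGE}@${digest}"
    done

The "Not found" (404) response above is consistent with the digest having already been removed, for example by a concurrent cleanup job, which would make the delete call, and hence the task, exit with a non-zero value.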

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
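
To trace the deprecation warnings to a specific build script or plugin, the failing task can be re-run with the flags suggested above (a sketch; the task path is taken from this log and the gradlew wrapper invocation is assumed):

  ./gradlew :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages \
      --warning-mode all --stacktrace --info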

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 46m 46s
113 actionable tasks: 75 executed, 34 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/npn325u6fbzzg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java17 #320

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java17/320/display/redirect>

